Welcome to My Blog
hands on: 15 sequence
RNN: A recurrent neural network has connections pointing backward. At time step $t$, a recurrent neuron receives the input $x_t$ as well as its own output from the previous time step, $y_{t-1}$.
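The recurrent neuron step sketched in this excerpt is commonly written as (notation assumed, with $W_x$ and $W_y$ the input and recurrent weight matrices, $b$ the bias, and $\phi$ the activation):

```latex
y_t = \phi\left(W_x^\top x_t + W_y^\top y_{t-1} + b\right)
```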
hands on: 16 nlp
Generate Text with a Character RNN: Encode every character as an integer. Encode the full text so each character is represented by its integer ID.
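The character-encoding step this excerpt describes can be sketched in plain Python (variable names are illustrative; the post itself presumably uses a Keras tokenizer):

```python
# Build a character vocabulary, then encode the full text as integer IDs --
# the first step of a character-RNN pipeline (illustrative names).
text = "to be or not to be"
chars = sorted(set(text))                     # vocabulary of unique characters
char_to_id = {c: i for i, c in enumerate(chars)}
encoded = [char_to_id[c] for c in text]       # full text as integer IDs
decoded = "".join(chars[i] for i in encoded)  # round-trip back to text
```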
hands on: 08 dimension reduction
In some cases, reducing the dimensionality of the training data may filter out some noise and unnecessary details, and thus speed up training.
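A minimal sketch of this idea, using the standard SVD recipe for PCA rather than any code from the post (data and names are illustrative):

```python
import numpy as np

# Project data onto its top principal components via SVD. The third feature
# is an almost-exact copy of the first, so two components capture the data.
rng = np.random.default_rng(42)
X = rng.normal(size=(100, 3))
X[:, 2] = X[:, 0] + 0.01 * rng.normal(size=100)   # mostly redundant feature
Xc = X - X.mean(axis=0)                            # center the data
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
X2d = Xc @ Vt[:2].T                                # keep the top 2 components
```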
hands on: 09 unsupervised
Clustering: Identify similar instances and assign them to clusters. Application: customer segmentation. Cluster customers based on their purchases…
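The segmentation idea can be sketched with a toy k-means loop on one-dimensional "annual spend" values (data and names are illustrative, not the post's code):

```python
# Minimal k-means: alternately assign each value to its nearest centroid,
# then move each centroid to the mean of its assigned values.
spend = [10, 12, 11, 95, 99, 102]
centroids = [10.0, 100.0]                 # initial guess, one per cluster
for _ in range(5):                        # a few assignment/update rounds
    clusters = {0: [], 1: []}
    for x in spend:
        k = min((abs(x - c), i) for i, c in enumerate(centroids))[1]
        clusters[k].append(x)
    centroids = [sum(v) / len(v) for v in clusters.values()]
```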
hands on: 14 cnn
Conv Layers: In turn, each neuron in the second conv layer is connected only to neurons located within a small rectangle in the first layer.
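The local connectivity the excerpt describes can be shown without any framework: each output value of a convolution depends only on a small kernel-sized rectangle of its input (a hand-rolled sketch, not the post's code):

```python
import numpy as np

# Valid 2-D convolution with a 3x3 averaging kernel: every output neuron
# reads only a 3x3 receptive field of the input image.
image = np.arange(25, dtype=float).reshape(5, 5)
kernel = np.ones((3, 3)) / 9.0
out = np.empty((3, 3))
for i in range(3):
    for j in range(3):
        patch = image[i:i + 3, j:j + 3]     # the small input rectangle
        out[i, j] = (patch * kernel).sum()
```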
hands on: 06 decision tree
Train and Visualize a Decision Tree: Try to understand how it makes predictions. from sklearn.tree import DecisionTreeClassifier…
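How a fitted tree makes a prediction can be sketched directly: walk from the root, testing one feature per node, until a leaf is reached. The structure and thresholds below are illustrative (iris-style petal splits), not the post's trained model:

```python
# A toy fitted tree: each internal node tests one feature against a threshold.
tree = {"feature": 0, "threshold": 2.45,           # e.g. petal length split
        "left": {"leaf": "setosa"},
        "right": {"feature": 1, "threshold": 1.75,  # e.g. petal width split
                  "left": {"leaf": "versicolor"},
                  "right": {"leaf": "virginica"}}}

def predict(node, x):
    # Descend until a leaf: go left when the tested feature is <= threshold.
    while "leaf" not in node:
        node = node["left"] if x[node["feature"]] <= node["threshold"] else node["right"]
    return node["leaf"]
```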
hands on: 05 svm
Linear SVM Classification: SVMs are sensitive to feature scales. Soft Margin Classification: If we strictly impose that all instances must be on the correct side of the margin, that is hard margin classification…
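The soft-margin idea can be made concrete with the quantity a linear SVM minimizes: an L2 penalty plus hinge loss on margin violations (toy numbers, illustrative only; not the post's code):

```python
# Soft-margin objective: 0.5*||w||^2 + C * sum of hinge losses.
w, b, C = [1.0, -0.5], 0.0, 1.0
X = [[2.0, 1.0], [0.5, 2.0]]
y = [1, -1]                                   # labels in {-1, +1}

# Margin of each instance: y_i * (w . x_i + b); hinge penalizes margins < 1.
margins = [yi * (sum(wi * xi for wi, xi in zip(w, x)) + b) for x, yi in zip(X, y)]
hinge = sum(max(0.0, 1.0 - m) for m in margins)
objective = 0.5 * sum(wi * wi for wi in w) + C * hinge
```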
hands on: 13 preprocess
Data API: Load data from disk. dataset = tf.data.Dataset.from_tensor_slices((df.values, target.values))…
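A pure-Python analogue of what `from_tensor_slices` does, to make the pattern visible without TensorFlow: slice the first axis of (features, targets) into per-example pairs, then batch them (an illustrative stand-in, not the Data API itself):

```python
# One (x, y) element per row, like tf.data.Dataset.from_tensor_slices.
features = [[1, 2], [3, 4], [5, 6], [7, 8]]
targets = [0, 1, 0, 1]
dataset = list(zip(features, targets))

def batch(ds, size):
    # Group consecutive elements into fixed-size batches.
    return [ds[i:i + size] for i in range(0, len(ds), size)]

batches = batch(dataset, 2)
```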
hands on: 04 loss
Linear Regression: The normal equation, $w = (X^TX)^{-1}X^Ty$. from sklearn.linear_model import LinearRegression…
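The normal equation can be applied directly with NumPy on toy data to recover the generating coefficients (data and seed are illustrative; sklearn's `LinearRegression` computes the same fit):

```python
import numpy as np

# Closed-form linear regression: w = (X^T X)^{-1} X^T y,
# with a bias column of ones prepended to X.
rng = np.random.default_rng(0)
x = rng.uniform(0, 2, size=50)
y = 4.0 + 3.0 * x + 0.1 * rng.normal(size=50)   # intercept 4, slope 3
X = np.c_[np.ones_like(x), x]                    # add the bias term
w = np.linalg.inv(X.T @ X) @ X.T @ y             # [intercept, slope]
```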
hands on: 02 e2e ml
A piece of info fed to a ML system is called a signal. We want a high signal-to-noise ratio. A sequence of data processing components is called a data pipeline.
hands on: 17 autoencoder
Performing PCA with an Undercomplete Linear Autoencoder: The following code builds a simple linear autoencoder to perform PCA…
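An undercomplete linear autoencoder (no activation, MSE loss) converges to the same subspace as PCA, so its behavior can be sketched by taking the PCA solution directly as encoder/decoder weights instead of training with Keras (illustrative data and names, not the post's code):

```python
import numpy as np

# Rank-2 data in 3-D: the third feature is an exact linear combination.
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 3))
X[:, 2] = 0.5 * X[:, 0] - 0.3 * X[:, 1]
Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
codings = Xc @ Vt[:2].T        # "encoder": 3-D -> 2-D codings
recon = codings @ Vt[:2]       # "decoder": 2-D -> 3-D reconstruction
mse = float(((Xc - recon) ** 2).mean())   # ~0 since the data is rank 2
```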