Post Category - Technology
About Salary
Abstract: 1. Different types of neurons: Linear neurons, Binary threshold neurons, Rectified linear neurons, Sigmoid neurons, Stochastic binary neurons 2. Reinforce
Read more
Abstract: 1. Subplots Output: 2. Histogram Output: 3. Box plots Output: 4. Heatmap Output: 5. Animation Output: 6. Interactivity Mouse clicking Outp
Read more
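The plot types listed in this summary can be sketched in a single figure. A minimal illustration follows; the random data and the non-interactive Agg backend are assumptions so it runs without a display, not details from the post:

```python
# Minimal sketch of a 2x2 subplot grid covering the plot types above,
# using the Agg backend so no display is needed.
import matplotlib
matplotlib.use("Agg")
import matplotlib.pyplot as plt
import numpy as np

rng = np.random.default_rng(0)
data = rng.normal(size=1000)

fig, axes = plt.subplots(2, 2, figsize=(8, 6))
axes[0, 0].hist(data, bins=30)           # histogram
axes[0, 1].boxplot(data)                 # box plot
axes[1, 0].imshow(rng.random((10, 10)))  # heatmap-style image
axes[1, 1].plot(np.sort(data))           # simple line plot
fig.tight_layout()
fig.savefig("demo.png")
```

In a Jupyter notebook the inline backend (mentioned in the next post) would render the figure directly instead of saving it.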
Abstract: 1. Matplotlib Backend Layer Deals with the rendering of plots to screen or files In Jupyter notebooks, we use the inline backend Artist Layer Contain
Read more
Abstract: 1. Reading a CSV file 2. Dates and times demo 3. NumPy introduction Output: [1 2 3] (2L, 3L) The use of arange: [ 0 2 4 6 8 10 12 14 16 18 20 22
Read more
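The quoted outputs can be reproduced with a short NumPy snippet; note the shape shown as `(2L, 3L)` comes from Python 2's long integers, while modern Python prints `(2, 3)`:

```python
# NumPy basics matching the outputs quoted in the summary above.
import numpy as np

a = np.array([1, 2, 3])
print(a)        # [1 2 3]

b = np.array([[1, 2, 3], [4, 5, 6]])
print(b.shape)  # (2, 3) -- shown as (2L, 3L) under Python 2

# arange with a step of 2 produces the even numbers below 24
evens = np.arange(0, 24, 2)
print(evens)    # [ 0  2  4  6  8 10 12 14 16 18 20 22]
```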
Abstract: 1. Hierarchical clustering Avoid choosing the number of clusters beforehand Dendrograms help visualize different clustering granularities (no need to reru
Read more
Abstract: 1. Mixed membership model This model aims to discover a set of memberships In contrast, cluster models aim at discovering a single membership In clus
Read more
Abstract: 1. Probabilistic clustering model (vs. k-means) Hard assignments do not tell the full story or capture the uncertainty; k-means only considers the cluster ce
Read more
摘要:,1.One nearest neighbor Input: Query article: Xq Corpus of documents (N docs): (X1, X2, X3,... ,XN) output : XNN = min disance(Xq, Xi) 2. K-NN Algorit
阅读全文
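The one-nearest-neighbor setup above can be sketched directly: given a query Xq and a corpus, return the closest document. The Euclidean distance and the toy 2-D corpus are assumptions for illustration:

```python
# 1-NN sketch: return the index of the corpus row closest to the query.
# Euclidean distance is assumed; the post leaves the metric open.
import numpy as np

def one_nearest_neighbor(x_q, X):
    """Index of the row of X with minimum distance to x_q."""
    dists = np.linalg.norm(X - x_q, axis=1)
    return int(np.argmin(dists))

X = np.array([[0.0, 0.0], [1.0, 1.0], [5.0, 5.0]])
idx = one_nearest_neighbor(np.array([0.9, 1.2]), X)
print(idx)  # 1
```

k-NN generalizes this by taking the k smallest distances instead of the single minimum.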
Abstract: 1. Precision and recall Precision is how precise I am at showing good stuff on my website; recall is how good I am at finding all the positive reviews prec
Read more
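The informal definitions above reduce to two ratios over prediction counts. A small sketch, with the counts chosen as an illustrative assumption:

```python
# Precision: fraction of predicted positives that are truly positive.
# Recall: fraction of actual positives that were found.
def precision(tp, fp):
    return tp / (tp + fp)

def recall(tp, fn):
    return tp / (tp + fn)

# e.g. 8 true positives, 2 false positives, 4 false negatives
p = precision(8, 2)  # 0.8
r = recall(8, 4)
```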
Abstract: 1. Ensemble classifier Each classifier votes on the prediction Ensemble model = sign(w1f1(xi) + w2f2(xi) + w3f3(xi)) w1, w2, w3 are the learned coefficients
Read more
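The weighted-vote formula above can be sketched in a few lines; the toy ±1 classifiers and weights are illustrative assumptions:

```python
# Ensemble prediction: sign(w1*f1(x) + w2*f2(x) + w3*f3(x)),
# where each weak classifier returns +1 or -1.
def ensemble_predict(classifiers, weights, x):
    score = sum(w * f(x) for f, w in zip(classifiers, weights))
    return 1 if score >= 0 else -1

f1 = lambda x: 1 if x > 0 else -1   # toy weak classifier
f2 = lambda x: 1 if x > 5 else -1   # toy weak classifier
f3 = lambda x: -1                   # always votes negative

pred = ensemble_predict([f1, f2, f3], [0.6, 0.3, 0.1], 3)
print(pred)  # 1  (0.6 - 0.3 - 0.1 = 0.2 >= 0)
```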
Abstract: 1. Quality metric The quality metric for the decision tree is the classification error: error = number of incorrect predictions / number of examples 2. Greed
Read more
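The error formula in the summary is straightforward to compute; the example labels below are an assumption for illustration:

```python
# Classification error as defined above:
# error = number of incorrect predictions / number of examples.
def classification_error(y_true, y_pred):
    wrong = sum(1 for t, p in zip(y_true, y_pred) if t != p)
    return wrong / len(y_true)

err = classification_error([1, 1, -1, -1], [1, -1, -1, 1])
print(err)  # 0.5  (2 of 4 predictions are wrong)
```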
Abstract: 1. Linear classifier It uses training data to learn a weight or coefficient for each word. We use gradient ascent to find the best model with
Read more
Abstract: 1. Fit locally If the true model changes much, we want to fit our function locally to different regions of the input space. 2. Scaled distance We pu
Read more
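The scaled-distance idea can be sketched as a weighted Euclidean distance, where each feature j gets a weight a_j; the weights and points below are illustrative assumptions:

```python
# Scaled (weighted) Euclidean distance:
# distance(x, y) = sqrt(sum_j a_j * (x_j - y_j)^2)
def scaled_distance(x, y, weights):
    return sum(a * (xi - yi) ** 2
               for a, xi, yi in zip(weights, x, y)) ** 0.5

d = scaled_distance([0, 0], [3, 4], [1.0, 1.0])
print(d)  # 5.0 (unit weights reduce to ordinary Euclidean distance)
```

Setting a feature's weight to zero removes it from the distance entirely, which is one way the scaling controls which features matter locally.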
Abstract: 1. Feature selection Sometimes, we need to decrease the number of features Efficiency: with fewer features, we can compute quickly Interpretability: wh
Read more
Abstract: 1. Ridge regression A way to automatically balance between bias and variance and regulate overfitting when using many features. Because the
Read more
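Ridge regression has a closed-form solution, w = (XᵀX + λI)⁻¹Xᵀy, which makes the bias/variance trade-off easy to see: a larger λ shrinks the coefficients. The synthetic data below is an assumption for illustration:

```python
# Closed-form ridge regression: solve (X^T X + lam*I) w = X^T y.
import numpy as np

def ridge_fit(X, y, lam):
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
true_w = np.array([1.0, -2.0, 0.5])
y = X @ true_w + 0.01 * rng.normal(size=50)

w_small = ridge_fit(X, y, 0.01)   # light regularization
w_big = ridge_fit(X, y, 100.0)    # heavy regularization shrinks w
```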
Abstract: 1. Training Error Define a loss function as below; the training error is defined as the average loss on houses in the training set, and RMSE is simply th
Read more
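With squared loss, the training error is the mean squared error and RMSE is its square root. A small sketch with made-up house prices:

```python
# RMSE: square root of the average squared loss over the training set.
import math

def rmse(y_true, y_pred):
    mse = sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)
    return math.sqrt(mse)

# two houses: errors of 2 and 0 -> mse = 2 -> rmse = sqrt(2)
e = rmse([3.0, 5.0], [1.0, 5.0])
```

Using the square root puts the error back in the original units (e.g. dollars rather than squared dollars), which is why RMSE is usually reported instead of MSE.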
Abstract: 1. Modeling seasonality w1 models the linear trend of the overall process. w2 models the seasonal component, a sinusoid with a period of 12, and you do no
Read more
Abstract: 1. Convex and concave functions A concave function is an upside-down convex function, and a convex function is bow-shaped 2. Step size Common choice: as th
Read more
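A common decreasing step-size schedule is η_t = η₀/t. The sketch below runs gradient descent on the convex function f(w) = (w − 3)², whose choice (along with η₀) is an illustrative assumption:

```python
# Gradient descent with a decreasing step size eta_t = eta0 / t
# on the convex function f(w) = (w - 3)^2, minimized at w = 3.
def gradient_descent(w0, eta0, steps):
    w = w0
    for t in range(1, steps + 1):
        grad = 2 * (w - 3)       # derivative of (w - 3)^2
        w -= (eta0 / t) * grad   # step size shrinks each iteration
    return w

w_star = gradient_descent(w0=0.0, eta0=0.9, steps=200)
```

Because the function is convex (bow-shaped), there is a single minimum and the iterates approach w = 3 regardless of the starting point.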
Abstract: 1. Multiple Features Note: X0 is equal to 1 2. Feature Scaling Idea: make sure features are on a similar scale, approximately in a -1 < Xi < 1 range For exam
Read more
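One simple way to get each feature into roughly the −1 to 1 range is to center it at the midpoint of its range and divide by half the range. The house-style data below is an illustrative assumption:

```python
# Rescale each column of X to [-1, 1]: center at the midpoint of the
# column's range, then divide by half the range.
import numpy as np

def scale_features(X):
    mid = (X.max(axis=0) + X.min(axis=0)) / 2
    half_range = (X.max(axis=0) - X.min(axis=0)) / 2
    return (X - mid) / half_range

# e.g. house size in sq ft and number of bedrooms, very different scales
X = np.array([[2104.0, 5.0],
              [1416.0, 3.0],
              [852.0, 2.0]])
Xs = scale_features(X)
```

Mean normalization (dividing by the standard deviation instead) is a common alternative; either way the point is that gradient descent converges faster when features share a scale.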
Abstract: 1. Linear Regression with One Variable Linear regression is a supervised learning algorithm, because the data set gives a right answer for each exampl
Read more
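One-variable linear regression fits h(x) = w0 + w1·x; the slope and intercept have a closed form from the sample means. A sketch on made-up data:

```python
# Least-squares fit of y = w0 + w1 * x using the closed-form formulas:
# w1 = cov(x, y) / var(x),  w0 = mean(y) - w1 * mean(x).
def fit_line(xs, ys):
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    w1 = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
          / sum((x - mean_x) ** 2 for x in xs))
    w0 = mean_y - w1 * mean_x
    return w0, w1

# data generated from y = 1 + 2x, so the fit recovers w0=1, w1=2
w0, w1 = fit_line([1, 2, 3, 4], [3, 5, 7, 9])
```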