Post Category - Technology

About Salary
Summary: 1. Different types of neurons: linear neurons, binary threshold neurons, rectified linear neurons, sigmoid neurons, stochastic binary neurons. 2. Reinforce… Read more
posted @ 2017-07-01 09:52 ClimberClimb Views(125) Comments(0) Recommended(0)
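The five neuron types listed in this excerpt can be sketched as simple activation functions; a minimal NumPy sketch (function names are mine, not from the post):

```python
import numpy as np

def linear(z):
    # Linear neuron: output equals the weighted input.
    return z

def binary_threshold(z, theta=0.0):
    # Binary threshold neuron: fires 1 if the input reaches the threshold.
    return (z >= theta).astype(float)

def relu(z):
    # Rectified linear neuron: zero below 0, linear above.
    return np.maximum(0.0, z)

def sigmoid(z):
    # Sigmoid neuron: smooth squashing into (0, 1).
    return 1.0 / (1.0 + np.exp(-z))

def stochastic_binary(z, rng):
    # Stochastic binary neuron: the sigmoid gives the probability of firing 1.
    return (rng.random(np.shape(z)) < sigmoid(z)).astype(float)

z = np.array([-2.0, 0.0, 3.0])
print(relu(z))              # [0. 0. 3.]
print(binary_threshold(z))  # [0. 1. 1.]
```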
Summary: 1. Subplots 2. Histograms 3. Box plots 4. Heatmaps 5. Animation 6. Interactivity (mouse clicking)… Read more
posted @ 2017-06-11 18:44 ClimberClimb Views(408) Comments(0) Recommended(0)
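The plot types listed here can be combined on one figure with `plt.subplots`; a minimal sketch with made-up data (using the Agg backend so it runs headless; in a notebook the inline backend renders the figure instead):

```python
import matplotlib
matplotlib.use("Agg")  # off-screen rendering; not needed inside Jupyter
import matplotlib.pyplot as plt
import numpy as np

rng = np.random.default_rng(0)
data = rng.normal(size=1000)

# Two panels side by side: a histogram and a box plot of the same sample.
fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(8, 3))
ax1.hist(data, bins=30)
ax1.set_title("Histogram")
ax2.boxplot(data)
ax2.set_title("Box plot")
fig.savefig("panels.png")
```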
Summary: 1. Matplotlib Backend Layer: deals with the rendering of plots to screen or files; in Jupyter notebooks we use the inline backend. Artist Layer: contain… Read more
posted @ 2017-06-10 19:25 ClimberClimb Views(424) Comments(0) Recommended(0)
Summary: 1. Reading a CSV file 2. Dates and times demo. NumPy introduction; the use of arange: [ 0 2 4 6 8 10 12 14 16 18 20 22… Read more
posted @ 2017-06-04 12:07 ClimberClimb Views(322) Comments(0) Recommended(0)
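The excerpt's garbled output appears to show array creation, `shape`, and `arange` with a step of 2; a small sketch reconstructing those basics:

```python
import numpy as np

a = np.array([1, 2, 3])               # 1-D array from a Python list
m = np.array([[1, 2, 3], [4, 5, 6]])  # 2-D array
print(m.shape)                        # (2, 3)

# np.arange works like range() but returns an array;
# here: even numbers from 0 up to (but excluding) 24.
evens = np.arange(0, 24, 2)
print(evens)  # [ 0  2  4  6  8 10 12 14 16 18 20 22]
```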
Summary: 1. Hierarchical clustering: avoids choosing the number of clusters beforehand; dendrograms help visualize different clustering granularities (no need to reru… Read more
posted @ 2017-06-02 23:12 ClimberClimb Views(201) Comments(0) Recommended(0)
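The point about dendrograms can be made concrete: agglomerative clustering records the distance at which clusters fuse, and those merge heights are exactly what a dendrogram draws. A naive single-linkage sketch on hypothetical 1-D points (the post itself does not specify a linkage):

```python
def single_linkage(points):
    """Naive agglomerative clustering with single linkage.

    Returns the merge heights, i.e. the distances at which clusters
    fuse -- the information a dendrogram visualizes.
    """
    clusters = [[i] for i in range(len(points))]
    heights = []
    while len(clusters) > 1:
        # Closest pair of clusters; single linkage means the
        # cluster distance is the minimum pairwise point distance.
        best = (None, None, float("inf"))
        for a in range(len(clusters)):
            for b in range(a + 1, len(clusters)):
                d = min(abs(points[i] - points[j])
                        for i in clusters[a] for j in clusters[b])
                if d < best[2]:
                    best = (a, b, d)
        a, b, d = best
        clusters[a] = clusters[a] + clusters[b]
        del clusters[b]
        heights.append(d)
    return heights

# Cutting the dendrogram anywhere between 1 and 9 yields the two obvious groups.
print(single_linkage([0.0, 1.0, 10.0, 11.0]))  # [1.0, 1.0, 9.0]
```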
Summary: 1. Mixed membership model: this model aims to discover a set of memberships; in contrast, cluster models aim at discovering a single membership. In clus… Read more
posted @ 2017-06-02 22:34 ClimberClimb Views(144) Comments(0) Recommended(0)
Summary: 1. Probabilistic clustering model (vs. k-means): hard assignments do not tell the full story or capture the uncertainty; k-means only considers the cluster ce… Read more
posted @ 2017-06-01 23:24 ClimberClimb Views(196) Comments(0) Recommended(0)
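The contrast with hard assignments can be shown with mixture-model responsibilities: instead of one winning cluster, each point gets a probability per cluster. A sketch with a hypothetical two-component 1-D Gaussian mixture (parameters are mine, not from the post):

```python
import numpy as np

def responsibilities(x, means, sigmas, weights):
    # Soft assignment: each cluster's responsibility for point x,
    # proportional to weight * Gaussian density, then normalized.
    dens = weights * np.exp(-0.5 * ((x - means) / sigmas) ** 2) / sigmas
    return dens / dens.sum()

means = np.array([0.0, 4.0])
sigmas = np.array([1.0, 1.0])
weights = np.array([0.5, 0.5])

# A point halfway between the clusters is genuinely uncertain;
# a hard k-means assignment would hide that.
print(responsibilities(2.0, means, sigmas, weights))  # [0.5 0.5]
print(responsibilities(0.1, means, sigmas, weights))  # heavily favors cluster 0
```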
Summary: 1. One nearest neighbor. Input: a query article Xq and a corpus of N documents (X1, X2, X3, …, XN). Output: XNN = argmin distance(Xq, Xi). 2. k-NN algorit… Read more
posted @ 2017-05-24 21:03 ClimberClimb Views(181) Comments(0) Recommended(0)
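The 1-NN definition above is a one-liner in practice: scan the corpus and keep the argmin of the distance. A brute-force sketch with hypothetical 2-D document vectors:

```python
import numpy as np

def one_nearest_neighbor(xq, corpus):
    # Brute-force 1-NN: index of the corpus row closest to the query
    # under Euclidean distance.
    dists = np.linalg.norm(corpus - xq, axis=1)
    return int(np.argmin(dists))

corpus = np.array([[0.0, 0.0], [5.0, 5.0], [1.0, 1.0]])
print(one_nearest_neighbor(np.array([0.9, 1.2]), corpus))  # 2
```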
Summary: 1. Precision and recall: precision is how precise I am at showing good stuff on my website; recall is how good I am at finding all the positive reviews. Prec… Read more
posted @ 2017-05-18 23:21 ClimberClimb Views(142) Comments(0) Recommended(0)
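The two definitions translate directly into counts of true positives, false positives, and false negatives; a minimal sketch with made-up labels:

```python
def precision_recall(y_true, y_pred):
    # Precision: of the items predicted positive, how many really are.
    # Recall: of the truly positive items, how many were found.
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    return tp / (tp + fp), tp / (tp + fn)

y_true = [1, 1, 1, 0, 0]
y_pred = [1, 1, 0, 1, 0]
print(precision_recall(y_true, y_pred))  # (2/3, 2/3)
```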
Summary: 1. Ensemble classifier: each classifier votes on the prediction. Ensemble model = sign(w1 f1(xi) + w2 f2(xi) + w3 f3(xi)), where w1, w2, w3 are the learned coefficients… Read more
posted @ 2017-05-17 22:10 ClimberClimb Views(144) Comments(0) Recommended(0)
Summary: 1. Quality metric: the quality metric for a decision tree is the classification error, error = number of incorrect predictions / number of examples. 2. Greed… Read more
posted @ 2017-05-13 09:10 ClimberClimb Views(246) Comments(0) Recommended(0)
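The quality metric above is just a ratio of mistakes to examples; a minimal sketch with made-up labels:

```python
def classification_error(y_true, y_pred):
    # error = number of incorrect predictions / number of examples
    wrong = sum(1 for t, p in zip(y_true, y_pred) if t != p)
    return wrong / len(y_true)

# A leaf that always predicts the majority class errs
# exactly on the minority examples.
y = [1, 1, 1, 0]
print(classification_error(y, [1, 1, 1, 1]))  # 0.25
```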
Summary: 1. Linear classifier: it uses the training data to learn a weight (coefficient) for each word; we use gradient ascent to find the best model with… Read more
posted @ 2017-05-11 22:24 ClimberClimb Views(115) Comments(0) Recommended(0)
Summary: 1. Fit locally: if the true model changes a lot, we want to fit our function locally to different regions of the input space. 2. Scaled distance: we pu… Read more
posted @ 2017-05-07 18:23 ClimberClimb Views(99) Comments(0) Recommended(0)
Summary: 1. Feature selection: sometimes we need to decrease the number of features. Efficiency: with fewer features we can compute quickly. Interpretability: wh… Read more
posted @ 2017-05-07 10:44 ClimberClimb Views(158) Comments(0) Recommended(0)
Summary: 1. Ridge regression: a way to automatically balance bias and variance and to regulate overfitting when using many features, because the… Read more
posted @ 2017-05-05 23:41 ClimberClimb Views(153) Comments(0) Recommended(0)
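The bias-variance trade the excerpt describes shows up directly in ridge's closed-form solution: the penalty shrinks the weights. A sketch on tiny made-up data (the post's own derivation is not reproduced here):

```python
import numpy as np

def ridge_fit(X, y, lam):
    # Closed-form ridge solution: w = (X^T X + lam * I)^(-1) X^T y.
    # Larger lam shrinks the weights, trading a little bias for
    # lower variance and curbing overfitting.
    n_features = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(n_features), X.T @ y)

X = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
y = np.array([1.0, 2.0, 3.0])
w0 = ridge_fit(X, y, 0.0)    # lam = 0 is ordinary least squares
w1 = ridge_fit(X, y, 10.0)   # heavy penalty: weights shrink toward 0
print(w0)  # [1. 2.]
print(w1)  # smaller in norm than w0
```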
Summary: 1. Training error: define a loss function as below; the training error is then the average loss on houses in the training set, and RMSE is simply th… Read more
posted @ 2017-05-04 22:23 ClimberClimb Views(134) Comments(0) Recommended(0)
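With squared loss, the training error is the mean squared error and RMSE is its square root, which puts the error back in the target's units (e.g. dollars for house prices). A minimal sketch with made-up prices:

```python
import numpy as np

def rmse(y_true, y_pred):
    # Root of the average squared loss over the training set.
    return np.sqrt(np.mean((np.asarray(y_true) - np.asarray(y_pred)) ** 2))

print(rmse([200.0, 300.0], [210.0, 310.0]))  # 10.0
```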
Summary: 1. Modeling seasonality: w1 models the linear trend of the overall process; w2 models the seasonal component, a sinusoid with a period of 12, and you do no… Read more
posted @ 2017-05-02 22:22 ClimberClimb Views(130) Comments(0) Recommended(0)
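The trend-plus-sinusoid model stays linear in the weights if the period-12 sinusoid is expanded into sin and cos features, so least squares can fit it without knowing the phase in advance. A sketch on a hypothetical noiseless monthly series:

```python
import numpy as np

# Hypothetical monthly series: linear trend plus a period-12 sinusoid.
t = np.arange(48, dtype=float)
y = 0.5 * t + 3.0 * np.sin(2 * np.pi * t / 12)

# Features: intercept, trend, and the two phase components of the sinusoid.
X = np.column_stack([np.ones_like(t), t,
                     np.sin(2 * np.pi * t / 12),
                     np.cos(2 * np.pi * t / 12)])
w, *_ = np.linalg.lstsq(X, y, rcond=None)
print(np.round(w, 3))  # recovers [0. 0.5 3. 0.]
```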
Summary: 1. Convex and concave functions: a concave function is a convex function turned upside down, and a convex function is bowl-shaped. 2. Step size; common choice: as th… Read more
posted @ 2017-05-01 17:49 ClimberClimb Views(171) Comments(0) Recommended(0)
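A common step-size choice for gradient descent is one that shrinks with the iteration count; a sketch minimizing a bowl-shaped quadratic (the schedule eta_t = 0.5/t is an illustrative choice, not necessarily the post's):

```python
def gradient_descent(grad, x0, steps=200):
    # Minimize a convex function by stepping against the gradient
    # with a decreasing step size eta_t = 0.5 / t.
    x = x0
    for t in range(1, steps + 1):
        x = x - (0.5 / t) * grad(x)
    return x

# f(x) = (x - 3)^2 is convex with its minimum at x = 3.
print(gradient_descent(lambda x: 2 * (x - 3), x0=0.0))  # 3.0
```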
Summary: 1. Multiple features (note: X0 is equal to 1). 2. Feature scaling. Idea: make sure features are on a similar scale, approximately in the range -1 < Xi < 1. For exam… Read more
posted @ 2017-05-01 11:09 ClimberClimb Views(231) Comments(0) Recommended(0)
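One standard way to get features into roughly the -1 < Xi < 1 range is mean normalization: subtract each column's mean and divide by its value range. A sketch with hypothetical housing features (square footage vs. bedroom count):

```python
import numpy as np

def scale_features(X):
    # Mean-normalize each column: subtract the column mean,
    # divide by the column's (max - min) range.
    return (X - X.mean(axis=0)) / (X.max(axis=0) - X.min(axis=0))

# Hypothetical features on wildly different scales.
X = np.array([[2104.0, 3.0], [1600.0, 3.0], [852.0, 2.0]])
Xs = scale_features(X)
print(Xs)  # every entry now lies within [-1, 1], columns are zero-mean
```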
Summary: 1. Linear regression with one variable: linear regression is a supervised learning algorithm, because the data set comes with a "right answer" for each exampl… Read more
posted @ 2017-05-01 00:23 ClimberClimb Views(314) Comments(0) Recommended(0)
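One-variable linear regression has a simple closed form: slope = cov(x, y) / var(x), intercept = mean(y) - slope * mean(x). A sketch on made-up supervised data where every x comes with its "right answer" y:

```python
def fit_line(x, y):
    # Least-squares fit of y = intercept + slope * x.
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
             / sum((xi - mx) ** 2 for xi in x))
    return my - slope * mx, slope

x = [1.0, 2.0, 3.0, 4.0]
y = [3.0, 5.0, 7.0, 9.0]  # exactly y = 1 + 2x
print(fit_line(x, y))      # (1.0, 2.0)
```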
