Academic Conference Rebuttal Templates
Note that the author rebuttal is optional, and serves to provide you with an opportunity to rebut factual errors in the reviews, or to supply additional information requested by the reviewers.
The rebuttal is limited to 4000 characters. Please be concise and polite. Comments that are not to the point or offensive will make rejection of your paper more likely. Make sure to preserve anonymity in your rebuttal. Links to websites that reveal the authors' identities are not allowed and will be considered a violation of the double-blind policy. Links to websites with new figures, tables, videos or other materials are not allowed.
- Clear structure and sound grammar (layout and organization)
- The content comes from rebuttals made public by the NIPS conference and applies to academic conferences in computer science. [NIPS posts the rebuttals of accepted papers from past years; see http://papers.nips.cc/ (only the comments of reviewers with positive opinions are shown). NIPS2013, NIPS2014, NIPS2015]
- A rebuttal only makes a difference when your paper is on the borderline of acceptance. If the reviewers' opinions are uniformly harsh, the rebuttal's effect is nearly negligible. Conversely, if the reviews are uniformly positive and raise no questions, you may skip the rebuttal.
- The rebuttal is read by the reviewers and the area chair. The "Confidential comment to Area Chair" field (visible only to the area chair) is generally used to report reviewer misconduct and can usually be left blank.
- We thank the reviewers for acknowledging the strong performance of this work and the quality of the presentation. We address the comments as follows.
- Thanking reviewers for positive comments: We thank the reviewers for their positive and constructive feedback.
- We thank all the reviewers for their helpful comments.
- Thanks for all your feedback and suggestions. We will carefully incorporate them into our paper.
- Thanks for the helpful comments!
- Thank you for the feedback and suggestions, we will add clarification where needed and include suggestions as space permits.
- We thank the reviewers for their careful consideration. We greatly appreciate the positive comments and address major concerns below.
- Outline: We thank all the reviewers for their efforts. We start this rebuttal by reiterating our contributions, and then address specific concerns, especially those from AR6, where there has clearly been some misunderstanding leading to a serious error in his/her review. We kindly ask that AR6 revisit his/her review in light of our clarifications below.
- We thank all the reviewers, and we apologize for typos, grammar mistakes, unclear notation, and missing citations. These will be corrected so that the overall writing meets NIPS standards. The clarifications below will be added to the paper or supplement, either as text or as figures.
- Thanks to all the reviewers for their time and feedback. We provide some specific responses and clarifications.
- We'd like to thank the reviewers for their careful reading and valuable comments. We believe the constructive feedback will improve the paper and increase its potential impact on the community.
- First, we'd like to emphasize the contributions:
- We would like to emphasize that the novelty of the method, which addresses how to efficiently learn the dependency between latent variables without explicit knowledge of the model, has been accepted as valid and legitimate by the reviewers. We are confident this is a useful contribution for making generic inference viable in practice. Omitted comments will be fixed in revision if accepted.
- We thank the Reviewers for their constructive comments. We believe the model proposed is very powerful and theoretically deep. We agree with the reviewers that the exposition and experiments should be improved and will address this in the revision.
We will re-structure the paper to improve clarity. We will also add more details (and add an example, space permitting) and clarify our contributions (Section 4.4) for better understanding. We will also fix minor typos.
Then we address major concerns below.
We thank the reviewer for the encouraging comments.
Generally speaking, a low score naturally draws the area chair's attention and therefore carries disproportionate weight (area chairs have said as much themselves), so the rebuttal should use its limited space to focus on refuting those reviewers.
R2: "The structure and writing is a concern"
We agree. This has been addressed in the arXiv version, which has much cleaner structure and writing, including an improved section on related work.
B. Apologies for being unclear in these parts of our paper, we address the individual points below, and will be more explicit on all of these in the full version.
4. A list of minor typos.
For Section 2.3, there may be some misunderstanding. The reviewer is correct that a simple alternative to our approach would be to run MAP on the latent variables, and then, holding the latent variables S fixed, apply the Bayesian melding method to the model variables. However, this is computationally expensive and does not scale to high dimensions, as the previous Bayesian melding method requires performing density estimation for the distribution tau. Instead, we propose an approximate joint prior in Section 3, which allows us to infer the latent variables and model parameters jointly. Thus our algorithm scales better than the original Bayesian melding algorithm.
R4 Missing citations
Note that we do cite as  and discuss the work by Parks et al in L090-100. We further clarify the distinctions below. We will include the work by Demirkus et al in the next version and discuss head pose estimation below.
Thank you for the references, which are now included in the current draft.
Thanks for pointing out some valuable related work. The first two works do not consider any features and instead consider the noise that occurs in observations. The third work is more application-oriented, using metric learning. Although we also demonstrate our model on a similar application, semi-supervised clustering, our work aims to provide a more general treatment of noisy features in matrix completion. In addition, their "uncertain side information" in fact corresponds to similar/dissimilar information between items in semi-supervised clustering, which means the uncertainty they consider is also on observations, while the noise we consider is on features. We are happy to include these related works in our final submission.
To Reviewer 8
1. This paper lacks the references for some related recent works (e.g. [1, Nesterov 2015] and [2, Lan 2014]).
We have included Lan's conditional gradient sliding paper in the reference , which we believe is more relevant than .
We will include  in the final version.
3. We thank the reviewer for mentioning the papers of Burer and Monteiro (2005) and Lee and Bresler (2009). Both papers are certainly relevant related work and should be discussed. The Burer and Monteiro (B&M) paper (with which we were previously familiar but neglected to cite) is important, and gives a helpful traceback of the factorization and nonconvex optimization idea in the optimization literature. While related, our algorithm and analysis are substantially different from these works. Essentially, B&M target the general semidefinite programming problem and have a more complex set of first-order techniques for nonconvex optimization (BFGS and augmented Lagrangian techniques, etc.). It would not be easy to do a direct numerical comparison, but we would expect our methods to perform comparably. In contrast, our method is clean and simple, targets a more limited class of problems, and correspondingly allows us to obtain a strong theoretical convergence analysis (the pesky extra factor of r notwithstanding). As stated by Burer and Monteiro (2003): "Although we are able to derive some amount of theoretical justification for [global convergence], our belief that the method is not strongly affected by the inherent nonconvexity [of the objective function] is largely experimental." We hope that our work will contribute to and help spur the further development of this important class of techniques.
We apologize for not clarifying all questions given the limited space and many reviews. We will fix all typos and add missing references in the next revision.
We will address all remaining minor suggestions in the final revision.
We will thoroughly check and fix grammatical errors in the final submission.
- How reviewers conduct their reviews: the NIPS2013 reviewer guidelines
- With uniformly excellent reviews, no rebuttal is necessary; for example, this one
- Successful rebuttal cases (supporters / total reviewers)
- nips28/reviews/1909 （2/6）
- nips28/reviews/1881 （4/9）
- nips28/reviews/1914 （3/6）
- nips28/reviews/1941 （2/3）
- nips28/reviews/1922 （4/7）
- nips28/reviews/1932 （4/7）
- nips28/reviews/1958 （4/6）
- nips28/reviews/1897 （4/6）
- nips28/reviews/1949 （4/6）
- nips28/reviews/1937 （3/4）
- nips28/reviews/1955 （5/7）
- nips28/reviews/1985 （6/7）
- nips28/reviews/1980 （5/5）
See Also (other resources)