Identifying collaboration between learners in a course is an important challenge in education for two reasons: First, depending on a course's rules, collaboration can be considered a form of cheating. Second, it enables a more accurate evaluation of each learner's competence. While such collaboration identification is already challenging in traditional classroom settings consisting of a small number of learners, the problem is greatly exacerbated in the context of online courses and massively open online courses (MOOCs), where potentially thousands of learners have little or no contact with the course instructor. In this work, we propose a novel methodology for collaboration-type identification, which both identifies learners who are likely collaborating and classifies the type of collaboration employed. Under a fully Bayesian setting, we infer the probability of learners' succeeding on a series of test items solely from graded response data. We then use this information to jointly compute the likelihood that two learners were collaborating and which collaboration model (or type) was used. We demonstrate the efficacy of the proposed methods on both synthetic and real-world educational data; for the latter, the proposed methods find strong evidence of collaboration among learners in two non-collaborative take-home exams.
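The two-stage idea in the abstract can be sketched in code: first estimate each learner's per-item success probability with a Rasch model, then compare the likelihood of a pair's observed agreement pattern under independent work versus a collaboration model. The sketch below is illustrative only; it uses a deliberately simple "full copying" alternative (answers always agree) rather than the paper's actual collaboration models, and all function names and parameter values are assumptions.

```python
import math

def rasch_prob(ability, difficulty):
    """Rasch model: probability of a correct response given learner
    ability and item difficulty (both on the same logit scale)."""
    return 1.0 / (1.0 + math.exp(-(ability - difficulty)))

def pairwise_collab_log_odds(resp_a, resp_b, probs_a, probs_b):
    """Log-odds that two binary response vectors arose from a simple
    'copying' model (answers always agree) rather than independent work.

    Under independence, the two responses agree on an item with
    probability p_a*p_b + (1-p_a)*(1-p_b); under full copying they
    agree with probability 1.
    """
    log_indep = 0.0
    all_agree = True
    for ya, yb, pa, pb in zip(resp_a, resp_b, probs_a, probs_b):
        p_agree = pa * pb + (1 - pa) * (1 - pb)
        if ya == yb:
            log_indep += math.log(p_agree)
        else:
            log_indep += math.log(1 - p_agree)
            all_agree = False
    # The degenerate copying model assigns zero likelihood to any disagreement.
    log_copy = 0.0 if all_agree else float("-inf")
    return log_copy - log_indep
```

In the full Bayesian treatment described in the abstract, the abilities and difficulties would themselves be posterior samples rather than point estimates, and the copying model would be replaced by richer collaboration types; this sketch only conveys the likelihood-comparison structure.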
Keywords: collaboration-type identification, Bayesian Rasch, sparse factor analysis, collaboration
Authors who publish with this journal agree to the following terms:
- The Author retains copyright in the Work, where the term “Work” shall include all digital objects that may result in subsequent electronic publication or distribution.
- Upon acceptance of the Work, the author shall grant to the Publisher the right of first publication of the Work.
- The Author shall grant to the Publisher and its agents the nonexclusive perpetual right and license to publish, archive, and make accessible the Work in whole or in part in all forms of media now or hereafter known under a Creative Commons 4.0 License (Attribution-Noncommercial-No Derivatives 4.0 International), or its equivalent, which, for the avoidance of doubt, allows others to copy, distribute, and transmit the Work under the following conditions:
- Attribution—other users must attribute the Work in the manner specified by the author as indicated on the journal Web site;
- Noncommercial—other users (including Publisher) may not use this Work for commercial purposes;
- No Derivative Works—other users (including Publisher) may not alter, transform, or build upon this Work, with the understanding that any of the above conditions can be waived with permission from the Author and that where the Work or any of its elements is in the public domain under applicable law, that status is in no way affected by the license.
- The Author is able to enter into separate, additional contractual arrangements for the nonexclusive distribution of the journal's published version of the Work (e.g., post it to an institutional repository or publish it in a book), as long as there is provided in the document an acknowledgement of its initial publication in this journal.
- Authors are permitted and encouraged to post online a pre-publication manuscript (but not the Publisher’s final formatted PDF version of the Work) in institutional repositories or on their Websites prior to and during the submission process, as it can lead to productive exchanges, as well as earlier and greater citation of published work (see The Effect of Open Access). Any such posting made before acceptance and publication of the Work shall be updated upon publication to include a reference to the Publisher-assigned DOI (Digital Object Identifier) and a link to the online abstract for the final published Work in the Journal.
- Upon Publisher’s request, the Author agrees to furnish promptly to Publisher, at the Author’s own expense, written evidence of the permissions, licenses, and consents for use of third-party material included within the Work, except as determined by Publisher to be covered by the principles of Fair Use.
- The Author represents and warrants that:
- the Work is the Author’s original work;
- the Author has not transferred, and will not transfer, exclusive rights in the Work to any third party;
- the Work is not pending review or under consideration by another publisher;
- the Work has not previously been published;
- the Work contains no misrepresentation or infringement of the Work or property of other authors or third parties; and
- the Work contains no libel, invasion of privacy, or other unlawful matter.
- The Author agrees to indemnify and hold Publisher harmless from Author’s breach of the representations and warranties contained in Paragraph 6 above, as well as any claim or proceeding relating to Publisher’s use and publication of any content contained in the Work, including third-party content.