Higher Education aims to develop complex theoretical, abstract and analytical reasoning capabilities in students. This objective can be accomplished by addressing four major steps: theoretical foundation, practice, communication and assessment (Petry, 2002). Theoretical background and practical exercises constitute the basic knowledge-building process at the initial stages. Assessment guides the student through studies of increasing complexity, allowing the student to identify the weak points in the knowledge-building process where further theoretical study and/or practice is required.

Theoretical foundation and problem-solving practice are well-known aspects of Higher Education. High-quality materials in printed and electronic formats, sometimes including multimedia (audio, video or computer graphics) and sometimes delivered through the Internet, are readily available today. Teaching aids such as computer-fed overhead projectors or electronic blackboards are commonplace in the classroom and facilitate the knowledge-building process. Moreover, computers in the classroom are a powerful tool for linking theory and problem-solving practice in engineering studies (Beyerlein et al., 1993).

The assessment process, on the other hand, has evolved slowly over the last decades. Pen-and-paper examination techniques have been translated to the computer-enabled classroom as software applications that present an exam on the screen and record the student's answers. This process can be seen as an external assessment aimed at evaluating the student's skills in order to give a pass/fail decision on a given subject. External evaluation can be useful for the student to know his or her skill level, but it usually falls short when the student wants to know "what's wrong", i.e. not only which questions were missed but also in which knowledge areas the student is having difficulties.

Weak areas cannot be identified from a single question or set of questions; it requires the assessment process (an examination or similar) to be considered as a whole. For example, the attention time, or time spent thinking on a specific question relative to the other questions (since some students think faster than others), clearly indicates the areas where difficulties hide. The pattern followed when answering the exam questions is another useful indicator, since students tend to answer first the questions they feel most comfortable with. Hesitation over an answer (changing it several times) is a further useful parameter. All these parameters, and many others, can be compiled and processed with the powerful analytic capabilities of modern data mining techniques, as sketched below. Collaborative assessment appears as the natural evolution from individual learning to collaborative learning, which has been proposed as a suitable technique to speed up the development of analytical reasoning, since different approaches are continuously suggested.
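The minimal Python sketch below illustrates how such per-question behavioural parameters (relative thinking time, answer changes, answering order) might be compiled and aggregated to flag candidate weak knowledge areas. It is not the method of the cited works; the record fields, thresholds and topic names are hypothetical and serve only to make the idea concrete.

```python
from dataclasses import dataclass
from collections import defaultdict
from statistics import median

# Hypothetical per-question log record; field names are illustrative only.
@dataclass
class QuestionLog:
    question_id: str
    topic: str            # knowledge area the question belongs to
    time_spent: float     # seconds spent on the question
    answer_changes: int   # how many times the answer was modified
    answer_order: int     # position in which the student answered it

def weak_areas(logs, time_factor=1.5, change_threshold=2):
    """Flag knowledge areas where behaviour suggests difficulties:
    long thinking time relative to the student's own median,
    repeated answer changes, or questions deferred to the end."""
    med_time = median(l.time_spent for l in logs)
    # Treat the last quarter of answered questions as "deferred".
    deferred = {l.answer_order for l in
                sorted(logs, key=lambda l: l.answer_order)[-max(1, len(logs) // 4):]}
    flags = defaultdict(int)
    for l in logs:
        if (l.time_spent > time_factor * med_time
                or l.answer_changes >= change_threshold
                or l.answer_order in deferred):
            flags[l.topic] += 1
    # Areas with the most flagged questions are candidate weak points.
    return sorted(flags.items(), key=lambda kv: kv[1], reverse=True)

# Example usage with made-up data:
logs = [
    QuestionLog("q1", "modulation", 40.0, 0, 1),
    QuestionLog("q2", "modulation", 210.0, 3, 5),
    QuestionLog("q3", "filters", 55.0, 0, 2),
    QuestionLog("q4", "filters", 60.0, 1, 3),
    QuestionLog("q5", "noise", 180.0, 2, 4),
]
print(weak_areas(logs))
```

In practice such hand-tuned thresholds would be replaced by the data mining techniques discussed above (e.g. decision trees or clustering over the same features), but the feature-collection step is the same.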
[1] Yen-Ting Lin et al., "Automatic Leveling System for E-Learning Examination Pool Using Entropy-Based Decision Tree," ICWL, 2005.
[2] Paul Gray et al., "Introduction to Data Mining and Knowledge Discovery," Proceedings of the Thirty-First Hawaii International Conference on System Sciences, 1998.
[3] Ian H. Witten et al., "Data Mining: Practical Machine Learning Tools and Techniques, 3rd Edition," 1999.
[4] D. Keith Lupton, "Portfolio versus Syllabus Methods in Experiential Education," 1979.
[5] J. K. Conn, "Deep Pockets," The Journal of the Florida Medical Association, 1992.
[6] Petra Perner et al., "Data Mining - Concepts and Techniques," Künstliche Intelligenz, 2002.
[7] Brian Larson, "Delivering Business Intelligence with Microsoft SQL Server 2012," 2006.
[8] Karl A. Smith et al., "Cooperative Learning: Effective Teamwork for Engineering Classrooms," Proceedings Frontiers in Education 1995 25th Annual Conference: Engineering Education for the 21st Century, 1995.
[9] M. Morant et al., "The Control Panel: A Deep Data-Mining Technique for the Lecturing of Engineering-Related Studies," 2009.
[10] W. F. Punch et al., "Predicting Student Performance: An Application of Data Mining Methods with an Educational Web-Based System," 33rd Annual Frontiers in Education (FIE 2003), 2003.
[11] Ian Witten et al., "Data Mining," 2000.
[12] Julian Lonbay et al., "The European Higher Education Area: Two Steps Closer," 2005.
[13] William C. Rau et al., "Humanizing the College Classroom: Collaborative Learning and Social Organization among Students," 1990.
[14] Witold Pedrycz et al., "Data Mining: A Knowledge Discovery Approach," 2007.
[15] Kalina Yacef et al., "Educational Data Mining: A Case Study," AIED, 2005.
[16] S. Sharan et al., "Expanding Cooperative Learning through Group Investigation," 1992.
[17] Maria Morant et al., "Accurate Knowledge Evaluation by Deep Data-Mining in Telecommunication Engineering Studies," 2009 EAEEIE Annual Conference, 2009.
[18] Osmar R. Zaïane et al., "Web Usage Mining for a Better Web-Based Learning Environment," 2001.
[19] M. Pringle et al., "Variations in Admission Rates," British Medical Journal, 1987.
[20] E. Petry, "Architectural Education: Evaluation and Assessment," 32nd Annual Frontiers in Education, 2002.
[21] A. Gokhale, "Collaborative Learning Enhances Critical Thinking," 1995.
[22] Steven Beyerlein et al., "Using a Learning Process Model to Enhance Learning with Technology," Proceedings of IEEE Frontiers in Education Conference (FIE '93), 1993.
[23] César Hervás-Martínez et al., "Data Mining Algorithms to Classify Students," EDM, 2008.
[24] Garey Ramey et al., "Donor Behavior and Voluntary Support for Higher Education Institutions," 1988.
[25] Ian H. Witten et al., "Data Mining: Practical Machine Learning Tools and Techniques, 3/E," 2014.