Verification of a Predictor for Performance of Computer and Information Science Students in a Problem-Solving Course

Robert M. Ryder
ryder@cis.usouthal.edu

Yuqin Pang
pyuqin@hotmail.com

The School of Computer & Information Sciences
The University of South Alabama
Mobile, AL 36688, USA

Abstract

The development of a simple 4-question tool predicting the performance of computer and information science undergraduates in a gateway problem-solving course was reported at ISECON 99. This paper reports the results of a second-year study, confirming that the predictor provides a useful correlation with the course final grade (Pearson r = 0.322). In addition, this follow-on research suggests that stronger enforcement of course prerequisites in Fall 99 (a 7.8% increase in mean MATH ACT; a 190% increase in precalculus completion) resulted in a 5.5% increase in mean predictor test score and a 16.5% increase in mean final grade.

Keywords: predictor test, problem-solving courses, computer science education, student prerequisites

Introduction

One of the greatest continuing debates in higher education centers on the prediction of success in college. No one knows for certain what predictors, if any, accurately determine whether a first-year student will be an academic whiz, a dropout, or just a run-of-the-mill college student (Gutkowski, 1998). Developing an effective predictor of academic success is therefore a critical issue for educators and students. The purpose of this research is to verify a predictive tool developed in Fall 1998 (Ryder and Waggener, 1999). This simple tool consists of four basic mathematical problems that require students enrolled in a problem-solving course to convert verbal problems to formulas.

The University of South Alabama is a public institution located on the southern coast of Alabama with an enrollment of nearly 12,000 students. The School of Computer and Information Sciences (CIS) has over 500 undergraduates and offers a curriculum leading to bachelor's degrees in Computer and Information Sciences with specializations in Computer Science (CSC), Information Science (ISC), and Information Technology (ITE). The school also offers joint programs with the College of Engineering leading to a Bachelor of Science degree in Computer Engineering (CpE), and with the Business School leading to a bachelor's degree in E-Commerce. All CIS and CpE majors are required to complete the two-semester course Problem-Solving and Programming Concepts. This gateway course must be passed with a grade of C or better in order to continue in the CIS program. A departmental heuristic states that a grade lower than B in the gateway course is a fair predictor of academic failure in computer and information sciences.

Faculty discussion continues on possible causes of poor student performance in the problem-solving course. Because this introductory course is crucial to students who enter the freshman year with an early interest in computer and information sciences, it is essential that faculty understand the nature of this problem. A predictive model, incorporated in a comprehensive retention program, could support timely intervention in the early stages of matriculation. One observation is that many students have difficulty converting word problems to programmable formulas and algorithms. This observation led to the question posed in Fall 98: Is there a simple predictor, based on transforming verbal problems to formulas, of undergraduate success in a gateway problem-solving course?
The Fall 98 study accepted the hypothesis that a simple test in converting word problems to formulas is a good predictor of success in a problem-solving course. The purpose of the Fall 99 study was to reaffirm the predictor tool and investigate possible trends. Thus the null hypothesis is twofold: 1) the predictor test will not be reaffirmed as a good predictor of success in a problem-solving course, and 2) enforcing prerequisites for the course will have no effect on the level of student success.

Background

Predictive tools have been widely employed in academia despite a continuing debate as to which factors accurately predict academic success. High school grade point average (HGPA) is considered to be the best predictor of success in college (Chase and Jacobs, 1989; Wesley, 1994), although issues of generalizability, institution types, and internal validity remain under study (Gutkowski, 1998). While as many as fourteen independent variables have been employed in predictor studies (Turk, 1998), there is interest in determining whether a simple tool can be developed to meet the parochial needs of a CIS program (Ryder and Waggener, 1999). Other factors related to predicting academic success at the college level have also been studied, including social support (Arce, 1996; Petrie and Stoever, 1997), levels of indecision and self-esteem (Arce, 1996), the utilization of first-year seminars (Hyers and Joslin, 1998), participation in summer academic enrichment programs (Hesser et al., 1998), classroom anxiety (Wilson, 1997), teacher effectiveness, and teacher characteristics (Dinnan and Moore, 1996).

Many predictive tools employed in academia are directed to the requirements of specific disciplines; some examples follow. The attitudes and behaviors of nursing students, measured by the Myers-Briggs Type Indicator (MBTI) and the Learning Orientation/Grade Orientation Scale II (LOGO II), were used to profile tendencies that contributed to or detracted from success in a nursing program (Barr, 1998). The College of Veterinary Medicine at the University of Illinois (UI) used various admission variables to predict subsequent success (Zachary and Schaeffer, 1994). The six-year study concluded that academic success was positively correlated with 1) grade point averages and standardized test scores, and 2) a regression equation developed by the school based on objective factors used during the admissions process. It is interesting to note that academic success was not correlated with subjective evaluations such as applicant interviews. The University of Kentucky Law School relies heavily on an applicant's undergraduate GPA and performance on the Law School Admission Test (LSAT). Other academic factors include the trend of college grades (a strong undergraduate finish), letters of recommendation, previous graduate study, the time interval between college graduation and application to law school, outside activities, and the difficulty of the undergraduate curriculum (University of Kentucky Law School, 1999). The California Psychological Inventory (CPI) has been studied by several researchers as a predictor of academic success; these studies focused on the non-cognitive variables of the CPI and how they served as valid indicators of academic success (Wida, 1997). Personality and cognitive predictors of performance in graduate business school were tested in a sample of Master of Business Administration (MBA) students (Rothstein and Paunonen, 1994).
Performance in a developmental mathematics course, together with student age, has been found to be significantly related to academic success (Johnson, 1996). The Group Assessment of Logical Thinking (GALT) has been shown to be a predictor of success in college-level chemistry courses (Bunce and Hutchinson, 1993). Chase and Jacobs (1989) found that high school grade point average (HGPA) (Pearson r = 0.57) and rank (Pearson r = 0.54) were the most significant predictors of freshman-year GPA. Lambert and Ruiz (1988) at Syracuse University found that the strongest correlation with freshman-year GPA was HGPA, followed by high school rank and SAT verbal. At the University of Pennsylvania, Baron and Norman (1992) studied SAT scores, English composition achievement, and class rank as predictors. They found that SAT scores were somewhat useful in predicting grades in a small selection of courses, including classes in economics, psychology, English, and several business courses. Young and Barrett (1992) developed a predictor tool that included SAT verbal and math scores, rank in high school class (RIC), cumulative college GPA, and a dummy-coded variable (0 = did not complete bachelor's degree, 1 = completed bachelor's degree). They also developed a measure of the average academic rigor (AVGRIGOR) of a student's high school program and incorporated that variable into their predictor. They found that the variables most highly correlated with both GPA and degree completion were AVGRIGOR and RIC. The significance of AVGRIGOR as a predictor could be diminished, however, if high schools adjust a student's rank in class when the student elects more difficult courses.

In the first phase of this study, Ryder and Waggener (1999) demonstrated that a simple 4-question tool, which tested a student's ability to convert word problems into formulas, was a good predictor of success in a first-year problem-solving course.

Methodology

The four-question predictor test (Appendix) was administered to 135 students enrolled in the problem-solving course during the first week of the Fall 1999 semester. An additional one-page questionnaire was given to capture student name, student number, ACT/SAT mathematics score, gender, age, country of high school attendance, whether the problem-solving course was being taken for the first time (Yes/No), and specialization (CSC, ISC, ITE, CpE, or other). ACT/SAT scores were confirmed against the University's student database and corrected as required. Ethnicity was also determined and added as a variable. Three instructors, two of them teaching two sections each, were involved in the study.

The data were analyzed using the Statistical Package for the Social Sciences (SPSS, version 8.0). The descriptive statistics included age, gender, ethnicity, major, and country of high school attendance. The results of the 4-question predictor test were correlated with ACT/SAT mathematics scores (N=43) and final grades (N=94). The reduced correlation sample sizes were due to two factors: 1) only 94 students completed the course, and 2) only 69 students reported or had on file ACT/SAT scores, and of this group only 43 completed the course. Correlations for the Fall 1999 data were again computed using the Pearson r. A chi-square test was used to test for differences between the Fall 1998 and Fall 1999 test subjects. In addition, one-way ANOVA tests were performed on the means of the final grades by major, instructor, ethnicity, class section, and number of correct answers on the predictor test, and a t-test was used to compare the means of final grades by gender.

Note that this study required analyzing the data in three groupings: 1) all students who completed the predictor test (descriptive statistics; N=135); 2) participants who reported or had on file a MATH ACT score and completed the course (correlation of MATH ACT with final grade; N=43); and 3) participants who completed both the predictor and the course (correlation of predictor with final grade; N=94). The third grouping is of most interest to this research.
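For readers who wish to reproduce this kind of analysis, the sketch below illustrates the tests described above using the open-source scipy library. It is an illustration only: the study itself was run in SPSS 8.0, the predictor and grade arrays shown are hypothetical, and only the chi-square table uses actual counts (the Fall 98 and Fall 99 correct-answer distributions of Figure 3).

```python
# Illustrative sketch only; the study used SPSS 8.0, not this code.
# The predictor/grade arrays are hypothetical; the chi-square table uses
# the Fall 98 / Fall 99 correct-answer counts reported in Figure 3.
from scipy import stats

predictor   = [0, 1, 1, 2, 2, 3, 3, 4, 4, 4]             # number of correct answers (0-4)
final_grade = [55, 58, 62, 70, 68, 74, 79, 81, 85, 83]   # course final grades

# Pearson correlation of predictor score with final grade
# (the study reports r = 0.322 for N = 94).
r, p = stats.pearsonr(predictor, final_grade)

# Chi-square test for differences between the Fall 98 and Fall 99 subjects,
# applied here to the distributions of correct answers (Figure 3).
counts = [[14, 29, 55, 36, 11],   # Fall 98: students with 0..4 correct
          [12, 28, 43, 36, 16]]   # Fall 99: students with 0..4 correct
chi2, p_chi, dof, expected = stats.chi2_contingency(counts)

# One-way ANOVA of final-grade means across groups (e.g., by specialization)
# and a t-test of final-grade means by gender, both on hypothetical groups.
f_stat, p_anova = stats.f_oneway([63, 70, 75], [54, 60, 74], [57, 66, 72])
t_stat, p_t     = stats.ttest_ind([70, 74, 81, 68], [62, 79, 85, 55])

print(f"r = {r:.3f}, chi2 = {chi2:.2f} (dof = {dof}), F = {f_stat:.2f}, t = {t_stat:.2f}")
```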
Results

DESCRIPTIVE STATISTICS

Parameter                                              Fall 98          Fall 99
Number of participants                                 145              135
Number of males                                        101              105
Number of females                                      44               30
MATH ACT range                                         14-36            12-33
MATH ACT mean value                                    21.8             23.5
Average student age                                    21.8             22.0
Ratio of students with precalculus to
  students completing the course                       28/92 (30.4%)    83/94 (88.3%)
Number of students with an ACT score who
  completed both the predictor and the course          92               43

Figure 1. Descriptive statistics for Fall 98 and Fall 99

Question      Fall 98 (n=145)   Fall 99 (n=135)
Question 1    46                50
Question 2    46                62
Question 3    54                83
Question 4    74                91

Figure 2. Number of times each predictor question was answered correctly

Number correct   Fall 98 (n=145)   Fall 99 (n=135)
0 correct        14                12
1 correct        29                28
2 correct        55                43
3 correct        36                36
4 correct        11                16

Figure 3. Number of students by number of predictor questions answered correctly

                          Number of students   Predictor score (mean)   Final grade (mean)
Specialization            Fall 98   Fall 99    Fall 98   Fall 99        Fall 98   Fall 99
Computer Science          55        35         2.11      2.15           63.3      75.8
Information Science       32        29         1.75      2.45           53.6      74.5
Information Technology    17        28         1.67      1.74           56.5      72.8
Computer Engineering      26        31         2.27      2.35           69.3      78.6
Other                     15        12         2.93      2.33           80.2      76.1

Figure 4. Comparison of results by specialization

CORRELATION STATISTICS

Correlation parameters          Fall 98          Fall 99
Predictor with Final Grade      0.3274 (n=92)    0.322 (n=94)
MATH ACT with Predictor         0.3274 (n=92)    0.455 (n=43)
MATH ACT with Final Grade       0.5578 (n=92)    0.125 (n=43)

Figure 5. Correlation results

Number correct   Fall 98 (n=92)   Fall 99 (n=94)
0 correct        56.7             60.4
1 correct        56.5             74.4
2 correct        60.1             75.6
3 correct        72.9             79.2
4 correct        78.4             80.3
Total mean       64.9             75.6

Figure 6. Mean of final grades vs. number of correct answers on the predictor test

Fall 98                  MATH ACT < 24          MATH ACT >= 24
Precalculus?             No          Yes        No          Yes
Number of students       39          21         25          7
Final grade (mean)       50.1        57         70.8        83.4

Fall 99                  MATH ACT < 24          MATH ACT >= 24
Precalculus?             No          Yes        No          Yes
Number of students       (see note)  26         (see note)  17
Final grade (mean)       (see note)  71.8       (see note)  77.8

Note: The number of Fall 99 students not meeting the precalculus prerequisite was too small to be significant.

Figure 7. Comparison of Fall 98 and Fall 99 performance based on students meeting MATH ACT and precalculus prerequisites

Parameter                       Fall 98   Fall 99   Improvement
Prerequisites (mean values)
  MATH ACT                      21.8      23.5      7.8%
  Precalculus                   30.4%     88.3%     190%
Outcomes (mean values)
  Predictor                     2.01      2.12      5.5%
  Final grade                   64.9      75.6      16.5%

Figure 8. Improvements in student performance from Fall 98 to Fall 99
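The improvement percentages in Figure 8 are relative changes of the Fall 99 means over the Fall 98 means. The short sketch below (not part of the original analysis) reproduces them from the values in the table:

```python
# Relative change from the Fall 98 mean to the Fall 99 mean, as reported in Figure 8.
fall98 = {"MATH ACT": 21.8, "Precalculus (%)": 30.4, "Predictor": 2.01, "Final grade": 64.9}
fall99 = {"MATH ACT": 23.5, "Precalculus (%)": 88.3, "Predictor": 2.12, "Final grade": 75.6}

for name in fall98:
    improvement = (fall99[name] - fall98[name]) / fall98[name] * 100
    print(f"{name}: {improvement:.1f}%")   # 7.8%, 190.5%, 5.5%, 16.5%
```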
Findings

1. The correlation between the predictor and success in a problem-solving course is confirmed, and the first null hypothesis is rejected (r = 0.322, p < 0.01).

2. A major finding of this study, summarized in Figure 8, suggests that enforcing course prerequisites improves outcomes in terms of both higher predictor scores and higher final grades.

3. The predictor test appears to measure student ability to convert simple word problems into formulas and may also measure student ability to comprehend programming concepts such as looping and conditionals (if/then).

4. This study will be continued to confirm whether stronger enforcement of prerequisites improves student performance.

5. Other questions of interest remain to be resolved. The CIS School suggests that a grade lower than B in the problem-solving course may be an indicator of failure in the CIS curriculum. Is there a correlation between final grade and the rate of success in completing CIS degree requirements? What is the effect of student persistence? Will students who repeat the course have a better outcome than might otherwise be predicted?

6. While the utility of this predictor should not be overstated, it has value because: ACT/SAT scores are not available for a significant number of students, especially foreign students, who comprise a large part of our enrollment; the knowledge level of students who have completed precalculus may vary considerably (our survey group included students who completed high school in 16 different countries); and the predictor takes about 30 minutes to complete and may provide immediate, early identification of students who require intervention such as additional testing or completion of developmental course work in algebra.

References

Arce, E., 1996, "The effects of social support and self-esteem on career indecision: a cross-cultural comparison between two groups of undergraduate students," 77th Annual Meeting of the American Educational Research Association, New York, NY, April 11.

Baron, J. and M. Norman, 1992, "SATs, achievement tests, and high school class rank as predictors of college performance," Educational and Psychological Measurement, 52:1047-1055.

Barr, J., 1998, "Predicting academic success in the nursing program: attitudes and behaviors of undergraduate nursing students toward their collegiate nursing education experience," NLN Educational Summit: Refocusing the Lens: Nursing Education for the New Millennium, September 24-26.

Bunce, D. and K. Hutchinson, 1993, "The use of GALT (Group Assessment of Logical Thinking) as a predictor of academic success," Journal of Chemical Education, March 1993, 70:3:183.

Chase, C. and L. Jacobs, 1989, "Predicting college success: the utility of high school achievement averages based only on academic courses," College and University, Summer 1989, 403-408.

Dinnan, J. and A. Moore, 1996, "Teacher characteristics as predictors of reading improvement among adult basic and secondary education students," 77th Annual Meeting of the American Educational Research Association, New York, NY, April 12.

Gutkowski, J., 1998, "Prediction of success of college students," School of Education, University of Michigan, www-personal.umich.edu/~joeg/success.html

Hamburg, M. and P. Young, 1995, "Statistical Analysis for Decision Making," 6th Edition, Appendix A, Wadsworth Publishing Co., Belmont, CA.

Hesser, A., L. Cregler, and L. Lewis, 1998, "Predicting the admission into medical school of African American college students who have participated in summer academic enrichment programs," Academic Medicine, February 1998, 73:2:187-191.

Hyers, A. and M. Joslin, 1998, "The first year seminar as a predictor of academic achievement and persistence," Journal of the Freshman Year Experience & Students in Transition, 10:1:7-30.

Johnson, L., 1996, "Developmental performance as a predictor of academic success in entry-level college mathematics," Community College Journal of Research and Practice, Jul-Aug 1996, 20:4:333-344.

Lambert, L. and L. Ruiz, 1988, "College courses in high school: implications for enrollment management," College and University, Winter 1988, 140-147.
Petrie, T. and S. Stoever, 1997, "Academic and non-academic predictors of female student athletes' academic performance," Journal of College Student Development, Nov-Dec 1997, 36:6:599-608.

Rothstein, M. and S. Paunonen, 1994, "Personality and cognitive ability predictors of performance in graduate business school," Journal of Educational Psychology, Dec 1994, 86:4:516.

Ryder, R. and A. Waggener, 1999, "A predictor for performance of computer and information science freshman in a problem-solving course," Proceedings of ISECON '99, October, pp. 120-125.

Turk, E., 1998, "Predictors of Academic Success at Mercer University," Georgia Sociological Association Annual Conference, Undergraduate Student Paper Award, www.mercer.edu/sociology

University of Kentucky Law School, 1999, "Academic factors for admission to the Law School," www.uky.edu/Law/admsn/academic.html

Wesley, J., 1994, "Effects of ability, high school achievement, and procrastinatory behavior on college performance," Educational and Psychological Measurement, 54:2:404-408.

Wida, K., 1997, "The CPI as a predictor of academic success," ERIC Document ED412463, August.

Wilson, V., 1997, "Factors related to anxiety in the graduate statistics classroom," Annual Meeting of the Mid-South Educational Research Association, Memphis, TN, November 12-14.

Young, J. and C. Barrett, 1992, "Analyzing high school transcripts to improve prediction of college performance," The Journal of College Admission, Fall 1992, 25-29.

Zachary, J. and D. Schaeffer, 1994, "Correlation between pre-veterinary admission variables and academic success in core courses during the first two years of the veterinary curriculum," Journal of Veterinary Medical Education, 21:2.

Appendix: The Four-Question Predictor Test

1. The XYZ Fence Company builds rectangular fences with dimensions x and y, where x and y are in whole feet. Each fence is constructed from 6-inch wide wooden boards spaced 6 inches apart, as shown below (note: a figure is provided with this question). The fenced-in area can have one or more 3-foot gates, which are added later; however, you must leave 3 feet of space for each gate. The formula that computes the number of 6-inch boards required for a fence with two gates is:
   a. 2(x+y)/2 - 3
   b. 2(x/2 + y/2) - 6
   c. 2(x+y) - 6
   d. (2x + 2y)/2 - 6
   e. no correct answer

2. The overtime pay rate at the Happy Toys Company is one and one-half times the regular pay rate. For example, if the regular pay rate is $10/hour, the overtime rate is $15/hour. For a weekly payroll, the overtime rate is applied to all hours worked in excess of 40 hours/week. If the hourly rate is P, what formula calculates the total pay for h hours, where h is greater than 40 hours?
   a. 1.5P(h-40)
   b. [40 + (h-40)1.5]P
   c. (40-h)1.5
   d. (40P + 1.5P)h
   e. no correct answer

3. Your credit card balance is $1000 and the monthly interest rate is 1%. If you do not pay off the full amount at the end of the month, 1% of the amount owed is added to your balance due, and then any payment is subtracted. For example, if you owe $1000 and make a payment of $100, the new balance is $910, calculated as follows: $1000 (balance) + $10 (interest) - $100 (payment) = $910. Starting with a balance of $1000, you make three consecutive monthly payments of $110, $209, and $300. Because you do not pay off the full amount owed, one percent of the balance owed is added to your account each month.
After making the payments noted, what is the balance due on your next statement?
   a. $400.76
   b. $407
   c. $409
   d. $410
   e. no correct answer

4. A utility company's electricity charges are based on the number of kilowatts (KW) used during a month; that is, the more a customer uses, the higher the rate. The rates are as follows:

   KW used                        Rate per KW
   Less than or equal to 1000 KW  $0.05 (for the first 1000 KW)
   1001 KW to 2000 KW             $0.07 (for the second 1000 KW)
   More than 2000 KW              $0.10 (for the remaining KW over 2000)

For example, if a customer used 1100 KW during the month, the charge would be computed as follows: 1000 KW x $0.05/KW + 100 KW x $0.07/KW = $50 + $7 = $57.

What would your bill be if your monthly usage was 2012 KW?
   a. $121.23
   b. $121.20
   c. $121.10
   d. $120.10
   e. no correct answer
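For readers who want to check the arithmetic described in Questions 3 and 4, the sketch below (not part of the test instrument; the function names are ours) implements the stated rules and reproduces the worked examples given in the question text ($910 and $57), along with the scenarios the questions ask about.

```python
# Sketch of the arithmetic rules stated in Questions 3 and 4; not part of the test itself.

def credit_card_balance(balance, payments, monthly_rate=0.01):
    """Question 3 rule: each month, add interest on the amount owed, then subtract the payment."""
    for payment in payments:
        balance = balance + balance * monthly_rate - payment
    return balance

def electric_bill(kw):
    """Question 4 tiers: $0.05 for the first 1000 KW, $0.07 for the next 1000 KW, $0.10 beyond 2000 KW."""
    first  = min(kw, 1000) * 0.05
    second = min(max(kw - 1000, 0), 1000) * 0.07
    beyond = max(kw - 2000, 0) * 0.10
    return first + second + beyond

# Worked examples given in the question text:
print(round(credit_card_balance(1000, [100]), 2))            # 910.0  (the $910 example)
print(round(electric_bill(1100), 2))                         # 57.0   (the $57 example)

# The scenarios the questions ask about:
print(round(credit_card_balance(1000, [110, 209, 300]), 2))  # 407.0  (balance after the three payments)
print(round(electric_bill(2012), 2))                         # 121.2  (bill for 2012 KW)
```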