March/April 2002 // Vision
Computerizing College Composition
by Joel Foreman
Note: This article was originally published in The Technology Source (http://ts.mivu.org/) as: Joel Foreman, "Computerizing College Composition," The Technology Source, March/April 2002. Available online at http://ts.mivu.org/default.asp?show=article&id=1034. The article is reprinted here with permission of the publisher.

Consider the plight of college composition teachers. Embedded in an educational system shaped by paper-based modes of production and the economics of collocation, they must expend significant amounts of mind-deadening instructional effort reading, commenting upon, and managing student documents.

The advent of Web-based learning technologies offers hope that this situation will one day change for the better. Course management systems such as WebCT, as well as the specialized writing support systems Daedalus Online and ConnectWeb, offer varying degrees of automated document management. Meanwhile, researchers such as Daniel Anderson and Fred Kemp are exploring, through practical applications, the efficiencies that might be afforded by writing instruction cyber-environments (Foreman, 2000a, 2000b).

Though progress is relatively slow, I believe we should be heartened by a number of promising technologies developed in the commercial domain:

  • Network analysis and tracking tools such as SilentRunner;
  • The first generation of SCORM-compliant learning content management systems;
  • Intelligent agent technologies such as NativeMinds;
  • Business-to-business (B2B) applications that integrate the computer systems of collaborating companies;
  • Data-mining tools from such vendors as MicroStrategy; and
  • Collaborative tool suites such as InfoWorkSpace.

I suggest that the features and functions of such tools will eventually converge to form an instructional solution to the labor problem I have noted above. The remainder of this article addresses the problem and the potential solution.

A Tough Job

I begin with a belief formed over 30 years of teaching and classroom observation: it takes talent, commitment, and years of experience to produce excellent writing instruction. In particular, I have in mind the high-order competence needed to produce incisive and helpful commentary on a semester-long stream of student papers. Writing instructors easily spend more than half of their instructional effort reading student work and then providing feedback, so the feedback system plays (and should play) a major role in student learning.

It is not unreasonable for a writing instructor to give one significant assignment each week, and thus to read and comment upon some 1,000 pages per course each semester. Having to read this mass of material (which is frequently boring, repetitious, and poorly written) is exceptionally tedious work. Worse yet, the instructor must read thoughtfully and skillfully so as to provide cogent, clear, precise, and relevant feedback in a very short period of time.

Now consider the conditions of the labor force that is called upon to perform such important work. Let my own university, George Mason, serve as an example. We teach approximately 290 required composition courses annually. We have five tenured/tenure-line compositionists who teach a handful of these courses. The rest are taught by 30 or so teaching assistants, about 30 part-time adjuncts, and seven non-tenured but full-time instructors who each teach four courses per semester.

What can be said about the performance of these groups?

The tenured/tenure-line group consists of professional compositionists with relatively light teaching loads who can be expected to provide instruction at a very high level; they teach a small fraction of the composition courses. The TAs, though earnest, committed, and ably directed by the English Department, are learning on the job a set of skills that I believe take years of experience to perform well. Perhaps a third of the adjuncts love to teach and do it exceptionally well. Many of the remainder are overworked because they teach multiple sections at multiple universities, and most, employed on a part-time basis, lack the incentives (e.g., a reasonable wage) that tend to produce a higher degree of commitment. The last group, non-tenured full-time instructors, must teach four courses per semester and are thus on a teaching/grading treadmill that makes it difficult to produce consistent and carefully considered instruction (note that the situation is worse still at the community college level, where composition instructors must typically teach five sections per semester).

The consequence, at the very least, is that college writing instruction is not as good as we should expect it to be.

Information Technology to the Rescue

How can information technology help with regard to the time-consuming and complex management of student papers?

First, software developers can provide better tools to help student writers detect and correct problems during and immediately after the acts of composition and revision. The more problems the students can self-correct, the less arduous the grader's task becomes.

Second, sophisticated workflow tracking and routing systems can lessen the instructor's load by (a) managing the flow of student documents through the grading cycle, (b) automating the collection of performance data, and (c) providing agent-based, intelligent tutoring for work on the common grammatical and rhetorical challenges.

Some specifics follow.

Warehousing Student Work

SilentRunner and MicroStrategy currently provide network and business intelligence analysts with collection, visualization, and analysis systems that sift through very large sets of network data and organize them into user-friendly, interactive visual displays. Using similar technologies, an advanced writing instruction support system would replace the conventional roll book with a "virtual dashboard" from which an instructor would monitor and control all the document flows in a course data warehouse. Student documents would be tagged with appropriate identifiers so that they could be tracked, retrieved, and represented according to an instructor's preferences. Thus all student documents would be automatically represented in one of several available views on the instructor's dashboard. Depending on need, an instructor might view a given class assignment to see if all submissions had been posted on time, view the status of all ongoing revision work, or access a single student folder to see a display of that student's grades and to review a specific submission.
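
To make the tagging and the dashboard views concrete, here is a minimal sketch in Python of how such records might be stored and filtered. Every class, field, and status value is a hypothetical illustration of the idea, not a feature of any product named above.

    # A minimal sketch of a course data warehouse whose records carry the tags
    # (student, assignment, status, timestamp, grade) needed to drive the three
    # dashboard views described above. All names are hypothetical.
    from dataclasses import dataclass
    from datetime import datetime
    from typing import List, Tuple

    @dataclass
    class Submission:
        student_id: str
        assignment_id: str
        status: str                  # e.g., "submitted", "in_revision", "graded"
        submitted_at: datetime
        grade: str = ""

    class Warehouse:
        def __init__(self) -> None:
            self.submissions: List[Submission] = []

        def add(self, sub: Submission) -> None:
            self.submissions.append(sub)

        # View 1: did every submission for a given assignment arrive on time?
        def assignment_view(self, assignment_id: str, due: datetime) -> List[Tuple[str, bool]]:
            return [(s.student_id, s.submitted_at <= due)
                    for s in self.submissions if s.assignment_id == assignment_id]

        # View 2: the status of all ongoing revision work.
        def revision_view(self) -> List[Submission]:
            return [s for s in self.submissions if s.status == "in_revision"]

        # View 3: a single student's folder, with grades to date.
        def student_folder(self, student_id: str) -> List[Tuple[str, str, str]]:
            return [(s.assignment_id, s.status, s.grade)
                    for s in self.submissions if s.student_id == student_id]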

Students would treat the composition portal as their application service provider (ASP) and do all their work with an online word processor. The system would provide spell and grammar checkers, and the latter would include an agent able to explain and illustrate the problems it detects, thus eliminating one of the basic shortcomings of current grammar checkers. NativeMinds could provide the technology to power such a system. Just imagine students interacting with a virtual grammar guru that adapts to their specific needs and helps them to master the targeted grammar rules.

The grammar checker would also assist students in predictable problem areas. It could, for example, place a lock on a text until a student writer responds to a query like, "Which transitional device have you used to bridge these two paragraphs?" It could flag the excessive repetition of words, check for variety in the length of sentences and paragraphs, and aggregate the first sentences of all paragraphs to form a sentence outline (as a basic check of organizational design) with the accompanying prompt: "Do these topics hang together? Is there a logical flow from topic to topic?" All of these functions are already present, to a degree, in such tools as Microsoft Word and the Copernic Summarizer.
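
As a rough illustration of how three of these checks might be implemented, the sketch below flags heavily repeated words, reports sentence-length variety, and pulls the first sentence of each paragraph into an outline. It is a deliberately simplified stand-in for the more sophisticated analysis a commercial checker performs; the sample text and thresholds are invented.

    # A simplified sketch of three of the checks described above: word repetition,
    # sentence-length variety, and a first-sentence outline. A production checker
    # would use proper sentence segmentation and linguistic analysis.
    import re
    from collections import Counter

    def repeated_words(text, threshold=5):
        """Flag words of four or more letters used more than `threshold` times."""
        words = re.findall(r"[a-z]{4,}", text.lower())
        return {w: n for w, n in Counter(words).items() if n > threshold}

    def sentence_lengths(text):
        """Return the word count of each sentence, a crude measure of variety."""
        sentences = re.split(r"(?<=[.!?])\s+", text.strip())
        return [len(s.split()) for s in sentences if s]

    def sentence_outline(text):
        """Aggregate the first sentence of each paragraph into a sentence outline."""
        outline = []
        for para in text.split("\n\n"):
            para = para.strip()
            if para:
                outline.append(re.split(r"(?<=[.!?])\s+", para)[0])
        return outline

    if __name__ == "__main__":
        sample = ("The essay argues that television shapes taste. The essay "
                  "then repeats the point about the essay.\n\n"
                  "The conclusion restates the claim. It adds nothing new.")
        print("Overused words:", repeated_words(sample, threshold=2))
        print("Sentence lengths:", sentence_lengths(sample))
        print("Outline:", sentence_outline(sample))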

A scan for plagiarism would also be automated with advanced algorithms such as those in current use by turnitin.com. The scanner would compare student submissions to all the submissions in the warehouse, to research material used to construct the assignment, and to relevant Web sites that might have been pilfered by students. Detected plagiarisms would be revealed to their "authors" so that they might be eliminated before submission. In addition, a tag would be embedded in the text so that a second violation would register on the instructor's dashboard for an appropriate intervention.
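
Turnitin's actual matching algorithms are proprietary, so the following sketch uses a simple word n-gram comparison only to illustrate the kind of scan described above; the threshold and document store are invented.

    # An illustrative (not production-grade) similarity scan: the share of a new
    # submission's word 5-grams that also appear in previously stored documents.
    import re

    def ngrams(text, n=5):
        words = re.findall(r"[a-z']+", text.lower())
        return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

    def overlap_score(new_text, stored_text, n=5):
        """Fraction of the new text's n-grams that also occur in a stored text."""
        new, stored = ngrams(new_text, n), ngrams(stored_text, n)
        return len(new & stored) / len(new) if new else 0.0

    def scan(new_text, warehouse, threshold=0.15):
        """Return (document id, score) pairs whose overlap looks suspicious."""
        return [(doc_id, score) for doc_id, text in warehouse.items()
                if (score := overlap_score(new_text, text)) >= threshold]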

Having student documents coded and stored in a data warehouse also has consequences for student peer review, a practice that is very much in favor as a positive learning experience for both writer and reviewer. One of the problems with peer review is that it is very difficult to keep track of who reviews whom and how well. Functions currently deployed in collaborative tools such as InfoWorkSpace and ipTeam and learning management systems such as Plateau could help as follows.

To qualify a submission for instructor review, a student would first need to have one or more peer reviews. A workflow system would guide a document through the review cycle while a version control system (equipped with an autodetect feature that reports changes in newly uploaded documents) would enable an instructor to track the review process from his dashboard. In addition, the reviews would be embedded in all student submissions (a function already possible in Microsoft Word) but only visible if and when the instructor wished to check them.
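
A minimal sketch of such a review cycle follows, assuming a hypothetical pair of workflow states and using a content hash as the "autodetect" check for whether a newly uploaded version actually differs from the previous one.

    # A minimal sketch of the peer-review workflow: a document collects the
    # required number of peer reviews before it becomes eligible for instructor
    # review, and a content hash detects whether a re-uploaded version changed.
    import hashlib

    class ReviewWorkflow:
        REQUIRED_REVIEWS = 1

        def __init__(self, text: str) -> None:
            self.versions = [self._digest(text)]
            self.reviews = []                      # (reviewer_id, comments) pairs
            self.state = "awaiting_peer_review"

        @staticmethod
        def _digest(text: str) -> str:
            return hashlib.sha256(text.encode("utf-8")).hexdigest()

        def upload_revision(self, text: str) -> bool:
            """Record a new version; report whether anything actually changed."""
            digest = self._digest(text)
            changed = digest != self.versions[-1]
            if changed:
                self.versions.append(digest)
            return changed

        def add_peer_review(self, reviewer_id: str, comments: str) -> None:
            self.reviews.append((reviewer_id, comments))
            if len(self.reviews) >= self.REQUIRED_REVIEWS:
                self.state = "ready_for_instructor"   # now surfaced on the dashboard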

Thus, by the time a student document actually appears for evaluation on an instructor's screen, various problems will have already been addressed through the Web-based medium.

Automating Instructor Feedback

On the instructor side, a feedback system would employ a hyperlinked comment insertion feature, provide relevant online tutorials, and keep track of the student's performance.

What I have in mind would update the conventional writing-handbook model, by which an instructor places symbols on a student document. To decipher the symbols, the student consults a key in a handbook and is directed to appropriate examples and exercises. A computerized version would make the process accountable and thus instructionally meaningful. Using an autotext function such as that already available in Microsoft Word, the instructor would embed pre-recorded comments in a student document. The database would record and aggregate all the comments in a given student's text and append an itemized report in the assignment section of the instructor's dashboard.
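
The aggregation step might be as simple as the sketch below, assuming each embedded comment carries a short code; the codes shown are hypothetical stand-ins for handbook entries.

    # A small sketch of folding the coded comments embedded in one student's
    # papers into an itemized report for the instructor's dashboard.
    from collections import Counter

    def itemized_report(embedded_comments):
        """embedded_comments: (assignment_id, comment_code) pairs for one student."""
        totals = Counter(code for _, code in embedded_comments)
        return sorted(totals.items(), key=lambda item: item[1], reverse=True)

    comments = [("essay1", "frag"), ("essay1", "agr"), ("essay2", "frag"),
                ("essay3", "frag"), ("essay3", "trans")]
    for code, count in itemized_report(comments):
        print(f"{code}: flagged {count} time(s) this semester")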

Thereafter, when reviewing a document, the student would click on the portions of text highlighted by the instructor, and travel via hyperlink to an online handbook. ConnectWeb already provides such a feature, but stops short of true interactivity because the student is left to passively consume the material in the online handbook. In an advanced system (Carnegie Learning's cognitive tutors are representative of what I have in mind) the student would work through a relevant tutorial and be tested on a given skill until able to demonstrate a high level of competency with problem detection and correction. At this point, the system would notify the instructor that the student had completed the required activity.
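
The testing loop might behave like the following sketch, which keeps presenting exercises on the flagged skill until the student meets a mastery criterion and then notifies the instructor; the criterion, exercise format, and callback names are all hypothetical.

    # A sketch of the mastery loop: present exercises on the flagged skill until
    # the student answers, say, four of the last five correctly, then notify the
    # instructor; otherwise report that a stronger intervention is needed.
    def run_tutorial(skill, exercises, get_answer, notify_instructor,
                     window=5, required=4):
        recent = []
        for prompt, correct_answer in exercises:
            recent.append(get_answer(prompt) == correct_answer)
            recent = recent[-window:]            # keep only the last few attempts
            if len(recent) == window and sum(recent) >= required:
                notify_instructor(skill)         # competency demonstrated
                return True
        return False                             # flag for the instructor's dashboard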

The intent is that the student would eventually internalize the rules needed to master various aspects of good writing. If repeated exposures to the online tutorial do not work, a dashboard flag would inform the instructor that a more aggressive intervention is required. At this point, the instructor would query the database to determine if a number of students have similar problems and thus might be grouped together for some intensive care.
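
That query might amount to nothing more than inverting the per-student problem counts, as in this sketch (problem codes and thresholds invented for illustration).

    # A sketch of the grouping query: find the problems that recur often enough,
    # for enough students, to justify pulling those students into a group session.
    from collections import defaultdict

    def group_for_intervention(problem_counts, min_occurrences=3, min_students=4):
        """problem_counts: {student_id: {problem_code: count}}."""
        groups = defaultdict(list)
        for student, counts in problem_counts.items():
            for code, n in counts.items():
                if n >= min_occurrences:
                    groups[code].append(student)
        return {code: students for code, students in groups.items()
                if len(students) >= min_students}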

When a student completed a course (or the need to repeat the course was established), an archive of the student's work and an itemized performance record would be available to guide future instruction. This would be a major improvement over the current system, which provides no record whatsoever (other than a grade), ensuring only that writing instructors are always starting from ground zero with regard to an understanding of new student needs.

Realizing the Vision

Realizing the system I have described above will probably require two related items: (1) a substantial amount of money and (2) national standards for writing instruction in higher education.

The estimates of Anderson (1992) and Murray (1999) indicate that the intelligent grammar tutorial could, by itself, easily cost one million dollars. Fred McFarland, who has been directly involved with the development of the different versions of Connect, estimates that the entire system (including design, development, testing, and marketing) could cost two or three million dollars and take three years to reach the customer base. Though these sums are significant, they are not the major impediment to the development of advanced writing support systems. McFarland notes that the real problem is the customer base itself: less than 3% of composition instructors currently use any of the available writing support systems (personal communication, December 12, 2001).

The social obstacle is the fierce independence of compositionists in their capacity as teachers and directors of composition programs. From their perspective, standards usually look like unwelcome and adverse dictates. In this respect, compositionists are not unlike the rest of us: we are all free market citizens who do not want to be told which tools or techniques to use. We want to adopt willingly the ones that work best for us.

With regard to the evolution and widespread implementation of sophisticated writing instruction support systems, I anticipate the gradual formation of a mutually reinforcing set of instructional standards and instructional technologies. First, software developers will deploy applications in a series of generations. This is already taking place. Second, a mounting base of users will adapt to the tools, request changes, and thereby assist in the evolution of standards. Third, the ensuing cascade effect will result in the emergence of an instructional application that performs according to many of the specifications I have described above and gains wide user acceptance.

The potential benefits are considerable. Carol Twigg, director of the Pew Learning and Technology Program, cites as evidence the yearly cost of teaching composition at Brigham Young University and at Tallahassee Community College: $616,480 and $757,294, respectively (personal communication, December 12, 2001). If we consider those figures in relation to the number of composition programs across the country, we can begin to imagine the kind of economies that might be produced by a writing support system that reduces instructor labor (or refocuses it on high-yield learning activities) and improves instruction. Given the ubiquity of college writing as a medium for student performance evaluation, the beneficiaries would include students and teachers in any course that requires written reports.

References

Anderson, J. R. (1992). Intelligent tutoring and high school mathematics. In C. Frasson, G. Gauthier, & G. I. McCalla (Eds.), Proceedings of the Second International Conference on Intelligent Tutoring Systems. Berlin, Germany: Springer-Verlag.

Foreman, J. (2000a, September). The leading edge of webcentric writing instruction. Converge. Retrieved January 14, 2002, from http://www.convergemag.com/magazine/story.phtml?id=2030000000001274

Foreman, J. (2000b, January). The paperless classroom. Converge. Retrieved January 14, 2002, from http://www.convergemag.com/magazine/story.phtml?id=2530000000001000

Murray, T. (1999). Authoring intelligent tutoring systems: Analysis of the state of the art. International Journal of Artificial Intelligence in Education, 10(1), 98-129.
