In the way I perceive the matter, the answer depends on one's beliefs regarding teaching and learning. I embrace the Gestalt theory approach to learning and teaching. Gestalt theory states that "to know the thing you need to be the thing." This approach has its limitations, for it assumes that we will never be able to know what a tree or a rock is, since we cannot experience their existence. We can describe their attributes but we cannot understand their essence. Humans can only understand (?!) other humans and social constructs.
On the other hand, this approach has the wonderful consequence of emphasizing a hands-on, experiential learning and teaching process -- the learning experiences should transform the subjects (students) by providing opportunities to *be* what is being taught. This is the purpose of role-playing exercises, business cases, presentations, hands-on computer assignments, etc.
That is not to say that the subjects should not be exposed to "concepts" (social-construct frames of reference), and should only be given "experiential" opportunities. What this is saying is that the centerpiece of the learning process is the experiential opportunities, while the concepts are what the subjects learn through these experiences. Teaching, therefore, is the design and management of these concepts and experiences with the objective of allowing the subjects to acquire knowledge regarding a topic.
In this context, what should a classroom be? The environment that will facilitate teaching and learning. It should not, however, be considered as a single unit -- just a room with certain educational facilities (electronic or not). Its concept should be extended to include small group discussion rooms, labs, external sites (like a business or courtroom that is visited by the subjects), and students' homes and/or offices, which, together, facilitate the learning process -- the extended classroom concept. It is the environment that is relevant to the knowledge of a topic that needs to be represented by the extended classroom. The whole notion of a multi-purpose classroom is a fallacy -- classrooms are different for different topics. Professors have been, consciously or unconsciously, adapting existing educational facilities to create this environment-relevant classroom.
I will try to give my contribution regarding teaching Information Systems and Information Technology (IS/IT), rather than try to say what is good for business, medicine, law, engineering, etc., or for a generic (utopian?!) classroom. So the question becomes: what is the environment needed for students to learn IS/IT? More than 60% of the public and private organizations in the US are using PCs or workstations connected to networks. Therefore, the first requirement is that each student in the (extended) classroom should have access to a PC/workstation connected to a network. This is the environment in which our students are/will be working, and all teaching and learning should be embedded in this environment.
The teaching and learning of IS/IT for business is centered on three basic roles students will perform in their jobs: (a) user of IS/IT to perform their jobs, (b) first-line management, and (c) top management. Of course, students in an MIS major or concentration will also have a fourth role: (d) MIS specialist. The environmental needs from a user's perspective are access, on the PCs/workstations, to productivity tools (office suites and the like), computer-mediated teamwork tools (group support systems like Lotus Notes, brainstorming, Delphi, video conferencing, etc.), and information retrieval, visualization and analysis tools (such as the Internet, particularly the Web, on-line specialized databases, graphics, etc.). First-line management and top management education require, in addition to the user requirements, exposure to decision-making situations (cases, simulations, field studies) and management support systems tools (such as project management, planning and control systems, and executive information systems).
The question now is: how do we map these environmental needs into the extended classroom facilities? What components of this environment should be present in a lecturing hall (traditional classroom), what should be present in specialized labs, what should be present in external sites, and what should be present in the students' homes and/or offices? The learning environment needs to be designed as a whole. It is not enough to have access to the Web (for example) in a specialized lab, if the students and faculty do not have access to the Web in the lecture hall, and at their homes and/or offices, too.
I am not sure, at this time, if I have an educated answer to this question. I believe that we cannot have all these environment requirements in all learning facilities, although all these technologies exist today. For example, it is not feasible to assume that we should have video conferencing from/to students' homes and/or offices, as well as from/to specialized labs, from/to lecture halls, etc. On the other hand, access to a word processor should permeate all these facilities. I hope that some of my colleagues in these discussions can help in this respect.
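One simple way to reason about this design question is to lay the mapping out as a feasibility matrix of technologies against facilities. The sketch below is only illustrative: the facility names and the yes/no entries are my assumptions, drawn from the examples just given (word processing should permeate all facilities; video conferencing from/to every site is not feasible), not a definitive design.

```python
# Illustrative sketch: the extended classroom as a technology-by-facility
# matrix. The entries are assumptions taken from the examples in the text.
FACILITIES = ["lecture hall", "specialized lab", "external site", "home/office"]

FEASIBLE = {
    # a word processor should permeate all facilities
    "word processor":     {"lecture hall": True, "specialized lab": True,
                           "external site": True, "home/office": True},
    # Web access is needed in the lab, the lecture hall, and at home alike
    "Web access":         {"lecture hall": True, "specialized lab": True,
                           "external site": True, "home/office": True},
    # video conferencing from/to every site is not (yet) feasible
    "video conferencing": {"lecture hall": True, "specialized lab": True,
                           "external site": False, "home/office": False},
}

def gaps(technology):
    """Return the facilities where a technology is not provided."""
    return [f for f in FACILITIES if not FEASIBLE[technology][f]]
```

Designing the environment "as a whole" then amounts to deciding, technology by technology, which gaps are acceptable and which must be closed.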
I feel it will enable the learning process in an extended classroom (see my answer to the first question) that otherwise would not be possible in the age of "do more with less." Universities are increasingly asked to provide more services and learning opportunities than citizens (and legislatures) are willing to pay for.
The emergence of non-traditional students as the predominant type of student, due to the fast pace of technological change in society, and the consequent need for workers to learn a second, third, fourth, etc., set of skills and/or profession in order to keep a place in the labor force, has led Universities to engage in life-long education, with declining budgets to support this new mission. For example, the University of Maryland System is supposed to accommodate 20% more students by the year 2002 (if I recall correctly) with the present level of resources, or even with less than is expended today.
I believe that doing more with less will be achieved NOT by increasing faculty workloads, or even decreasing the size of the faculty and piling up students in larger lecturing halls, but rather by mapping the extended classroom onto sites other than the lecturing hall. This can only be done with the progress of technology to extend easy access to information and education to the households. The lecturing hall, in a typical two and a half hours of classroom meetings a week for a 3-credit course, may be used only for one or one and a half hours, with the remaining content mapped to the students' household or workplace desktops.
Let me give a simple example of what this means. Two semesters ago I had a student who had just had a baby and was going to miss the final exam: as we all know, the first 90 days of a baby's life take both parents' time day and night, and he was in charge of the "night" shift with the baby and could not come to the school to take the test. The existence of e-mail (a simpler technology than we are discussing here) solved the problem. I e-mailed him the final exam by 11 PM and gave him up to 4 AM to e-mail me back his answers.
The same way we all agree today that (almost) everybody can have a word processor at home, I can foresee the day when most households will have direct access to all University special facilities (labs, databases, applications), with full graphical and hypermedia capabilities (through TCP/IP or any other standard), at reasonably fast speeds. We have the embryonic ability to do some of this, at low speed, with restricted access and non-graphical interfaces, at most Universities today.
Let me give a slightly more sophisticated example to illustrate what we are doing and what we will be able to do with these new technologies. I used to prepare my classes using software to develop and print charts, which I later transformed into transparencies and copied for the students, either having them buy the copies or the university provide them for free. Next, I was able to have a computer in the lecturing hall with a display panel that allowed me to show my charts without needing to create transparencies, but I still made copies for the students and gave the copies to them.
Today, I still create my charts using the same software, but I convert the charts to graphical files and post them to the World Wide Web, in addition to my syllabus, assignments, etc. I walk into the lecturing hall and, using a network connection, I display my class materials for the students. I no longer print my charts -- the students can see these charts in the Business Information Center and print them or save them to files. Unfortunately, only the students who have paid to join an Internet service provider ($10-$30 a month) can see the charts from home (see my class materials for INSS 640, Information Systems and Technology, at the Web location http://worf.ubalt.edu/~abento/640/640syl.html).
In the years ahead, I am sure, ALL students will be able to access these materials, and many others, from home. This new process saves my time in class preparation and in managing the copy process, saves the University's and students' copying costs, and provides the students with full documentation of all my lecture materials 24 hours a day -- YES, more with less.
Finally, I do not believe that robotic education will ever be achieved. Yes, on-line tutorials are on the upswing. Yes, electronic communications are increasing dramatically, and facilitating human interaction as never before possible -- more than 5 million messages a day are exchanged on the Internet and other networks in the US alone. But teaching, as I said before, is the design and management of concepts and experiences with the objective of allowing the subjects to acquire knowledge regarding a topic, and it requires the presence of a mentor and facilitator to succeed. Social theory studies show that motivation and social rewards are the basic ingredients necessary for a student to learn and succeed in his/her efforts to learn. Therefore, the hub that the lecturing hall represents in the extended classroom concept, with a faculty mentor and motivator as the human presence, cannot be substituted in the learning process by any of the new technologies.
Were the first tools created by the cave people beneficial to them? Were they cost effective? I believe so. Human beings have been struggling with their environment and their shortcomings ever since, with one invention leading to another in the pursuit of better and easier ways of living.
No technology exists for technology's sake, especially in a market economy, for its providers would shortly be out of business. Some new ideas and proposals translated into technologies may not survive, or may only be used at later dates, when they are rediscovered. This issue is only relevant if you want to be on the cutting edge, without sliding into the bleeding edge, of the technology.
And if you think that being conservative and using only safe, proven technologies will solve the problem, please think twice -- the research on the use of information technology for strategic impact shows that only the innovators harness the benefits of the new technologies. For all others, the new technologies become costs necessary to compete (running to stay in place) without any additional benefits.
Will new technologies be beneficial for the students? This is a non-question, for the students are the future, where these technologies will be commonplace or obsolete, replaced by newer technologies. Henry Ford, more than forty years ago, said of one of his newly developed automobile models: "If it exists, it is obsolete."
Will these new technologies be beneficial to faculty? I believe and hope so. As one who still remembers the excitement of being one of the first houses on the block to have a (black and white) TV set in the early 1950s, of seeing human beings walking on the moon in the 1960s, of assembling his first PC from a Heathkit in the late 1970s, of using desktop computers in the 1980s with the power of the mainframes of just a decade before, and of participating in the birth of the Information Superhighway in the 1990s, I cannot wait to see what the 2000s will bring.
Of course, the issue of faculty development is, more than ever, a MUST DO! To my 13-year-old daughter, programming a VCR, or installing software on her PC, is easy. She has seen me do these things since she was a little baby. But this is not true for the great majority of the faculty, who grew up and had their education when none of these things were commonplace. No one can expect faculty who did their Ph.D.s in the 1970s or early 1980s, in areas other than Information Systems or Computer Science, to be able to incorporate even the existing technologies in the classroom without substantial (and carefully planned) development in these new technologies.
This leads us to the second part of the question: will these technologies be cost effective and efficient? In my prior response on new technologies in the households, I partially answered this question. Yes, it is the only way we can "do more with less." But this answer is predicated on two assumptions: (a) we will design and implement university facilities for this purpose, and (b) we will invest in faculty development to use these technologies and facilities.
It is not enough to buy Maseratis if we decide to save on the tires and do not train the drivers -- the cars will be as useless as if we had not bought them. This seems obvious, but I have seen this type of mistake happen often enough, in my almost thirty years as an Information Systems professional, to know that it is the major threat in all new technology initiatives. Some universities will be able to "bite the bullet" and do it, while others will not, and will be left behind with bleak futures. I sincerely believe and hope that the University of Baltimore, and the University of Maryland System in general, will be among the survivors and will leap forward.
How to motivate faculty to produce more with less? Should release time and other incentives be provided to the faculty to use the new technologies?
I can only answer this question if I make some assumptions regarding objectives and goals for the University of Maryland System, and the University of Baltimore in particular. I will describe three possible and competing goals and answer the question in light of each of these possible goals.
The answer is NO! Just increase faculty teaching loads, provide access and minimum training on the new technologies, and let the faculty find ways to do more (handle more students) with less (time for class preparation, grading, etc.). In order to take care of the increased load, faculty will find ways, using the new technologies or not, to increase their productivity.
The answer is a qualified YES! As I indicated in prior messages in this discussion, the great majority of the faculty need carefully planned development in these new technologies to be able to use them to do more with less. But should everybody receive release time, etc., to develop these new skills and class materials? How can we evaluate and quantify how much release time should go to which faculty members? This is, in my opinion, a "mission impossible," and this is why the yes above is qualified.
I suggest that this decision be decentralized, by department, or by groups within departments (sub-specialties). If the System or the University has to produce an estimated number of student credit-hours by 2002, with a given number of faculty members, let's set student credit-hour targets by departments and/or groups, irrespective of the number of sections or courses that each faculty member has to teach, and let the departments and groups decide who is going to receive more or less release time. In addition, let's set corresponding quality-of-education targets to be sure that the increased productivity is not achieved by lowering the quality of education. This is the concept of quality teams (groups), used successfully in business. Intermediate targets (in this seven-year plan) are very important for both administrators and faculty.
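The arithmetic of such decentralized targets is simple. The sketch below, with made-up department names and credit-hour figures, apportions a system-wide credit-hour target across departments in proportion to their current production, leaving each department free to decide internally how release time is distributed:

```python
# Illustrative sketch with made-up numbers: apportion a system-wide
# student credit-hour target across departments in proportion to each
# department's current production.
def credit_hour_targets(current_by_dept, system_target):
    """Return a per-department target proportional to current credit-hours."""
    total = sum(current_by_dept.values())
    return {dept: round(system_target * hours / total)
            for dept, hours in current_by_dept.items()}

# Hypothetical example: a 20% system-wide increase (10,000 -> 12,000 hours).
current = {"Accounting": 4000, "MIS": 3000, "Marketing": 3000}
targets = credit_hour_targets(current, system_target=12000)
```

Each department would then see only its own target, and intermediate (say, yearly) targets could be computed the same way against interim system goals.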
As a faculty member, I feel that, in the last four years, I have been doing increasingly more and producing less. Why? Look at this year at the Business School -- I am teaching 25-50% more courses and producing fewer student credit-hours than before. Will the use of new technologies solve this problem? I don't think so. I have also heard that the System is not happy with the level of faculty productivity.
Why not set targets that both faculty and administrators can use to measure progress towards the final goal? So that, if a group feels that it can give more or less release time to some faculty members in order to meet its department/group goal, let it do so. If the group feels that more development is needed for all faculty members, rather than release time, let it do that -- within the existing budget constraints, of course.
The answer is an unqualified YES! But the resources for faculty development will have to come from grants and donations. Intermediate targets are still important, as discussed above, but faculty development targets also need to be brought into the picture, including faculty buy-out monies (for part-time substitutes) to compensate for the initial productivity losses of the release time given to the faculty.
This is the only way to guarantee that most faculty members will upgrade their skills, develop materials based on the new technologies, and substantially increase both the quality of education and faculty productivity.
This page is maintained by Al Bento who can be reached at email@example.com. This page was last updated on April 9, 1996. Although we will attempt to keep this information accurate, we cannot guarantee the accuracy of the information provided.