Cyborgs R Us

Like a few million others, I got a new iPhone for Christmas. I think it has 56 gigabytes of memory. I’m eagerly anticipating the release of the Apple Watch in a few months. I’ve promised my daughter I’ll buy her one, and doubtless I’ll get one for myself. Then I’ll be a full-on cyborg, with my heart rate, step and activity patterns, sleep (or, more likely, not-sleep) record, and I can’t imagine what all else (but I want it!) recorded and readable to me (and maybe millions of others, though I can’t comprehend why they would be interested) on my iPhone, my iPad, my MacBook Pro with its Retina display (which I’ve never quite understood, but know is “good,” no, “better than good … as in great”), and my Mac (sitting there on my desk all lonely because it can’t get up and go). I’ve had a FitBit exercise monitor for years and have dutifully entered on my FitBit webpage every morsel I’ve put in my mouth; never mind that I’ve gained weight despite never (in almost three years) having had a day when my caloric intake exceeded my burn. Crap! I step naked on my FitBit scale every morning and the results (not just weight, but also percent body fat … however it knows that) are automatically sent to all my devices. Every morning I get a cup of coffee and sit down to check my stats … and then fire up my financial tracking program to monitor my “total net wealth” (thankfully it is above zero). This is 2014 and I’ll soon be 72.

Religion Writer

The study of religion is bound in the often uncomfortable tension between opposing positions and forces. It seems we would need to know what religion is in order to study religions, yet how do we know what religion is without encountering religions? How do we state what we know about religion without predisposing these definitional and categorical statements toward specific “prototypical” religions? Indeed, I think it is fair to say that the current study of religion is based heavily, if tacitly, on Christianity as the prototype. Religion, as it is perhaps most commonly experienced, is loaded with non-language, experiential, bodily phenomena, yet the study of religion seems tightly bound, almost exclusively so, to language phenomena (scripture, philosophy, doctrine, description, history, and other academic studies). Academic methods, including academic writing conventions, demand objectivity and scorn subjectivity, feeling, and emotion. Academic methods are restricted to the mind and ignore and discount the body. Yet extensive research during the last half century has increasingly supported the position that conceptual and propositional thought, even reason itself, is based in subconscious sensorimotor patterns, schemas, and meanings.

A Horse is an Automobile without Wheels

September 8, 2012

In memory of Kenneth Morrison

Thirty years ago I published a book titled Beyond “the Primitive”: The Religions of Nonliterate Peoples (1982) that was intended to establish some less biased position from which to appreciate and understand folks living in small-scale cultures, tribal or, as they were for some time called, “traditional” peoples. While studying at the University of Chicago, I found that much of the heritage of the academic study of religion was established in the study of what were called “primitive” people, and in those days there was only a nascent awareness of the inappropriateness of this term. It was the “primitives” who were to tell us how religions got started in the process of human cultural development, and the issue was variously framed in evolutionist terms (in which case magic preceded the rise of religion) or essentialist terms (in which religion, being essentially inseparable from divine creation, existed from the earliest of times and was found in “primitive” cultures, evidenced by the presence of a “high god”). In a fascinatingly illogical position, contemporary people who live in small-scale cultures were considered to represent these ancient people, the people “of the beginning.” My teacher, Mircea Eliade, perhaps the most influential religion scholar of the twentieth century, was a major proponent of this approach, constructing his influential understanding of religion, one still present in popular understandings, through exemplary studies of “primitive people,” especially the aboriginal people of Australia. Decades later my book Storytracking: Texts, Stories, and Histories in Central Australia (1998) attempted to place this approach in a constructive and comprehensible context (or history), if a critical one as well.

That Little Thing

Completing my undergraduate major requirements in mathematics before I was a senior, I had grown impatient with mathematics, largely because it seemed to me at the time so isolating from people. I was utterly naïve of course and had become a math major simply because my mom told me to do so, believing (why, I have no idea) that with a degree in mathematics I’d surely get a good (meaning well-paying) job. I took a course in business administration from Professor Larry Jones. He was a tall, clean-cut, pipe-smoking, intellectual-looking man. On one assignment I was asked to indicate what I would do in a particular business situation, which was extensively described. I wrote my paper on the many reasons I was sure that I would never have gotten myself into that situation in the first place and would thus not have to deal with the thorny problems clearly present in it. He gave me a “D.” I was not, however, discouraged, because I caught a glimpse of a sea change that was at that moment taking place—this was the mid-‘60s—a shift from a behaviorist perspective to methods of quantitative analysis supported by the introduction of computers into the business environment. I realized that with my background in mathematics I was rather well placed to put this to use in the business world.

Entering a graduate degree program in business, I found myself positioned with the right stuff at the right time. Professor Jones had landed a senior administrative position at the Coleman Company—he would later become president of the company (and would later run for governor of the state)—and he was not so put off by my performance that he failed to recognize that I might have something to offer, even if I could be counted on to argue with premises. I was hired in a research position at the Coleman Company, I somehow landed a position teaching a class on quantitative methods in the Business School at Wichita State University, and I was a full-time graduate student. I enjoyed many privileges and opportunities in these interrelated capacities, even though I was a bit busy. Financial and business success seemed completely assured. All was smooth sailing, I felt, and I totally loved everything I was doing.

We never know what comes next in life, what little thing might happen on any day that will alter the course of our lives, or so it would seem anyway. As I had been impressed by Professor Larry Jones and did all I could to learn from and be inspired by him, I admired even more Professor Harry Corbin. He had been a young president of Wichita University, a municipally based school when I entered in 1960, and in the early ‘60s he ushered through the Kansas State Legislature the measures necessary to have the university accepted into the state system of higher education under the new name of Wichita State University. As an undergraduate student leader in numerous capacities, I had had many opportunities to observe President Corbin in action. He too was a tall, handsome, reserved, quietly powerful, clearly brilliant man. He seemed to me the very epitome of an academic. Once the school had been accepted into the state system, Corbin gave up his presidency and returned to his research and teaching. He had studied political science at the University of Chicago and was deeply interested in religion.

During my graduate studies in business, I learned that Professor Corbin was teaching a course on world religions. I suspected it would be much like a history course, and that concerned me, since the only “C” grade I had ever gotten was in a world history course. Still, I truly wanted the experience of being in a class taught by Professor Corbin. My business advisor allowed me to take the course, most likely because Corbin was so respected that it would have been unacceptable to suggest that his course would not contribute to any student’s work.

So there it was. That little thing. A course that didn’t fit my program, taken for personal reasons. Once the course started, it didn’t take long. I have often described the experience I had in that course as like discovering a door theretofore unknown to me that, when opened, revealed the enormity of a world I didn’t even know existed. Small-town Kansas education is not all that worldly, for sure, nor are studies in mathematics and business, even in a state university. But here it was … an enormous, rich, complex, confounding, luscious world of peoples in era after era and culture upon culture. That little thing had suddenly turned into one of almost unfathomable dimension.

Though my work and study and teaching were all exciting, the success I experienced in all of them was perhaps the greatest wedge opening me to the need for change. In the research capacity I enjoyed at Coleman, I was centrally involved in replacing groups of working people with computer applications. I saw upper-level executives forced into early retirement because they couldn’t adjust to the tsunami of computer technology. My satisfaction with a job well done, with my role in facilitating the march of technology, was tempered by the human costs I observed as people were displaced and outmoded. As my power and accomplishments grew, so did my doubts and concerns.

Somehow I got the idea that I would benefit from a brief sabbatical from business to reflect and regroup.  I sensed that the timing was crucial, because I could feel that I was quickly approaching a point of no return.  My superiors at the Coleman Company were sympathetic to the idea, so I set about considering what I might do for a while to beneficially fill a leave of absence.  This took me to Professor Harry Corbin.  And this is actually the part of the story I want most to tell.

I met him in the office he had occupied as university president, retained I’m sure as a way of honoring his considerable contributions. It was handsome and elegant and made one feel important simply to be in it. I explained my situation to him and asked his suggestions. Corbin said, “Well, I’ve studied off and on for decades at the University of Chicago. Might you consider that?” I very clearly remember asking, “Oh, do they have a university in Chicago?” He assured me that they did and that it might be worthy of my consideration. Since I had studied world religions with him, I thought it might be fun to continue those studies and asked if that would be possible there. He indicated it was and referred me to the Divinity School. The story gets better, or perhaps worse.

Knowing not a thing about it, I contacted the Divinity School and asked for an application for admission. I received it, filled it out, and sent it back. I was informed that I’d need to take an entrance exam and that they had their own exam, which would be sent to Wichita State, where it would be administered to me. I remember taking the exam, but absolutely nothing about it. This whole thing was premised on my firm belief that I’d be there just a few months. I was then notified that I had been accepted to the Divinity School and was asked what field I wished to study. I wrote back to ask what fields I might choose among. They sent me a list, and I really didn’t recognize much of anything on it, so I selected “Christian Theology.” They wrote back indicating that that field had filled, but might I be interested in a field called the “History of Religions”? Even with my concerns about studying history, I knew it really didn’t matter, short-termer as I was planning to be, so I responded, “sure.” And that is how I entered the University of Chicago and a profession that has continued to unfold for more than forty years.

My Great Awakening

for Alex Perry

I’d been sitting across the desk from him for what seemed an eternity. He was hunched over my paper, commenting on every one of the dozens of red notations he had written there. We were still on the first page. Jonathan Smith: “You describe Dwight L. Moody as ‘infamous.’ Do you have any idea what that word means?” (not waiting for me to answer) “You should never ever use that word to describe such a figure as Moody.” Why didn’t I just get up and leave? I had slid down in my seat to the point I was about to fall onto the floor … well, this was perhaps more a description of my self-esteem than of my physical body.

It had all started just a little over a week before. I’d conferred with another professor in the Divinity School at the University of Chicago, where I was a new student. How I got there is another story, but needless to say, with an undergraduate degree in mathematics, a graduate degree in business administration, and only one religion course on my transcript, this was not a place where I felt at home. I was a floundering homeless academic living under an overpass, a high-speed highway travelled by my classmates, who all had graduate degrees in religion or history or language. This professor had asked me if I’d yet worked with Professor Smith. Learning that I had not, he directed me (it seemed a command actually) to contact Professor Smith to arrange to work with him.

Dutiful and responsible if nothing else in this graduate program, I mustered my courage and made an appointment to meet with him.  When I walked into his office he seemed barely to notice me, but eventually asked my business.  I told him that I had been referred to him by another professor and I was there because of that.  “Hmmm,” he said looking at me quizzically, “so why would you be the sort of person I’d want to work with?”  Oh wooo!   I had no answer whatsoever for that question.  I can’t even remember what I did, but it surely was little more than to stand there with a dumb look on my face.  Finally, he said, “Well okay then.  Write me a paper and leave it next week.”  I muttered some sort of agreement and left.

I don’t know why I chose to write on Dwight L. Moody and late nineteenth and early twentieth century revivalism, but that’s what I chose. I’m guessing the paper was 12 to 15 pages long. I dropped it off the next week and made an appointment to meet with Smith in a couple of days to get his response.

That response was, as I have described, nothing short of a bludgeoning. I felt humiliated and stupid and grilled and belittled and hammered and embarrassed … just to begin the list of my feelings. However, I sat there and listened and took notes and tried to keep from crying. Certainly in the fog of emotions I was experiencing there were some thoughts of what I might do with my life given this state of failure. Yet then a voice, Smith’s voice, now seeming so very faint and far away, somehow penetrated my awareness. As he stood up, extending the paper to me, he said, “Not a bad paper really. Revise it and have it back here next week.”

As I found my way outside his office I experienced the strangest sequence of changes and awakenings. Did he just say, “Not a bad paper”? Did he just ask me to revise it and get it back to him? Surely this means that he hasn’t sent me away for good, drummed me out. He wants a revision! Oh my god, it wasn’t that bad! As I walked along it suddenly dawned on me that I had just had my first real learning experience. This man thought enough of my work to take it deadly seriously, down to my every choice of words. It mattered to him what I wrote, and it mattered in the greatest detail.

It was a moment of awakening and transformation.  To have someone take my work seriously enough to give it the full measure of criticism in service to my learning, my education, was something I’d never experienced before in this way.  It was my first true learning experience and I knew that from that day forward I would take myself as seriously as had Smith.  He had somehow seen something in me I hadn’t seen in myself and that isn’t the way it should be.  Not only did this experience set the course of my education, it set the course of my career as an educator.

Thoughts on Considering New CU Religion Programs

Not since Gutenberg has a revolution in media impacted the world as greatly as e-media are doing today. Although movable type was invented in the fifteenth century, it took many decades for the impact of typography to be widely felt, yet it is clear looking back that the world changed in fundamental ways as a result of this change in media. Today e-media are developing so rapidly that we experience a barrage of change approaching chaos. Alvin Toffler’s Future Shock, published just 40 years ago (a major book when I was in graduate school), described as “shock” the popular experience of rapid change, the future seemingly slamming into the present. Yet, compared with today, we surely think of the ‘70s as a rather lazy decade. Here are some statistics that suggest something not only of the order of change in the present world, but also of the measure by which it is being accepted and incorporated in lifestyles the world over. The number of songs available on iTunes in 2007 was 3.5 million; today there are 6 million, plus 65,000 podcasts, 10,000 music videos, 20,000 audiobooks, and 500 movies. In 2007 Wikipedia had 4 million entries; today there are over 16 million. Facebook was not significant in 2007; today Facebook has half a billion (yes, 500 million) active users and includes 900 million objects. YouTube had 6.1 million videos in 2007; today there are 120 million. In a mere 3 years Wikipedia has grown 400%; YouTube 2000%; and Facebook at an incomprehensible rate.
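The growth arithmetic behind those last percentages is worth a quick look, since “grown 400%” can be read two ways. Here is a minimal sketch in Python, using only the figures cited above; the percentages in the paragraph appear to express each new total as a rounded multiple of its 2007 figure rather than as a percentage increase.

```python
# A quick check of the growth figures cited above (2007 vs. 2010 totals).
figures = {
    "Wikipedia entries": (4_000_000, 16_000_000),
    "YouTube videos": (6_100_000, 120_000_000),
}

for name, (y2007, now) in figures.items():
    multiple = 100 * now / y2007            # new total as % of the 2007 figure
    increase = 100 * (now - y2007) / y2007  # percentage increase over 2007
    print(f"{name}: {multiple:.0f}% of the 2007 figure "
          f"(a {increase:.0f}% increase)")
```

Run as written, this prints 400% of the 2007 figure for Wikipedia (a 300% increase) and just under 2000% for YouTube, matching the round numbers above on the “multiple of the original” reading.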

Over a decade ago our religion colleague Mark Taylor wrote, “We are living in a moment of unprecedented complexity, when things are changing faster than our ability to comprehend them.” And he brought this message directly to higher education: “The same information and telematic technologies responsible for the shift from an industrial to a postindustrial economy are bringing higher education to a tipping point where unprecedented change becomes unavoidable.” His analysis is interesting, yet almost amusing or endearing when one reflects that at the time of his writing Taylor had never heard of Facebook, or iTunes, or Wikipedia (launched in 2001), or YouTube, or texting, or social networking. Taylor also noted the enormous resistance he experienced among academics to even considering the impact of the inevitable, a position I’m guessing has not changed at all.

Not only are e-media creating unprecedented change; so also are global economic forces. It is becoming increasingly clear and widely accepted that rather than a “V”-shaped recession (a decline followed by a rise) or even a double-bounce “W”-shaped recession, the more likely letter to describe this economic situation is “L,” in which the decline, once it reaches bottom, does not bounce back to previous levels but rather remains flat for quite an extended period. Housing, financial reform, health care, global warming, wars and conflict, the growing divide between wealthy and poor, and the hostile polarized political climate—all experienced widely in the world—seem to support the “L” shape in economic patterning.

As a result of these major influences on society and the world, it is highly unlikely, and perhaps not even all that desirable, that the university will simply return to what has been so familiar since Kant set forth the principles on which the modern university took shape, principles that have persisted without major change for over 200 years. Virtually all futurists indicate the high likelihood of major structural changes in a system that is prohibitively costly to operate and that has become almost totally dependent on funds from state and federal governments, business and industry, and charitable donors. It is increasingly clear that this system cannot long persist without fundamental change.

I think that the recent decline of newspapers offers a parallel that might be applicable to the inevitable revolution (though few are ready yet to acknowledge this) of the presently structured campus-based university. Consider a few possibilities.

Let’s say that university faculty begin to package their courses as on-line courses and begin to shift their work from a single campus base of operation to a worldwide audience. Of course, on-line courses are already quite common and have been for at least 15 years. It is thus only an incremental step before on-line delivery becomes the principal method rather than a supplement to classroom delivery. For-profit universities are already developing on the basis of e-delivery of much of education.

Let’s say that state legislatures continue to be financially stressed, as they have been now for some years. It doesn’t seem that the end of state government financial stress is yet in sight, and the structural changes presently being made are changing most universities in fundamental ways (ways that may well be beneficial).

Let’s say that family income available for higher education continues to be in short supply, so that an increasing percentage of families shop for lower-priced, yet still high-quality, education for their children. There is currently an explosion in enrollment at community colleges, evidence that this trend is well under way.

Let’s say that studies begin to show that for most students, for many of the courses they take, their goals of education can be effectively met by on-line courses. If faculty are not limited to a single institution, the very finest faculty in the country and world in any subject could produce the bulk of the on-line courses. Who wouldn’t prefer to learn from the finest in the world, even if via e-media? There is further advantage in that such courses are available to learners inexpensively and accessible at any time from any location. Perhaps these studies will show that students may benefit from one semester in residence for every two years of traditional campus-based learning (face-to-face learning will long be valued), with the rest done effectively at home or while working or performing community service or even traveling.

Should students spend but 25% to 50% of their educational time on campus, the economies of operating physical university plants could be greatly reduced. Campus colonies (as Thomas Frey has suggested) will develop to use some of this unused space, enabling a learning-while-working community led by faculty and non-university professionals directed towards a work/learn environment that actually creates products, performs services, and so forth, incorporating learning as an essential dimension. The strong interdependence of the sciences and business and defense already demonstrates the success of this model. In this context, small liberal arts colleges might persist in which the best of faculty and students can research, learn, and engage totally independent of immediate needs, retaining some small pockets of the classical understandings of higher education that have already given way as universities have become producers of workers.

It seems clear that the tipping-point conditions for fundamental change have already been met, yet the surprising conservatism of university educators will fend off the inevitable impacts as long as possible, perhaps with disastrous results. The point really is not to despair and read the situation pessimistically, but rather to see that in any situation of change there is great opportunity.

As the Religious Studies faculty begin discussions about the development of new programs, these factors would seem relevant. It would seem foolhardy to initiate any new program at this time without some careful investigations and considerations. Minimally these would be:

  • What is the likely future of higher education and the impact this changing context will have on considered programs?
  • What is the future of the study of religion particularly in the context of the likely changes in higher education?
  • What are the motivations, goals, desires, and contributions held by CU religion faculty for any proposed programs?

I am unaware that the CU religion faculty has made any effort to look at the future of higher education and its potential impact on the study of religion. This is why I provided the few sketchy paragraphs at the beginning. With Jonathan Smith we initiated a discussion of the future of the study of religion over the next 40 years. I understand that Greg was pursuing the publication of Smith’s lecture. To my knowledge, no one on the religion faculty has otherwise engaged in any further consideration of what Smith presented. I have reshaped the writing course to consider these issues, and both of my spring course offerings are directly developed in response to Smith’s lecture. Of course, while Smith may be in many ways the most important person to chart the future of the study of religion, his resistance to any technological innovation makes him perhaps the least insightful in these areas.

As for CU faculty motivations to develop new programs, the one I have heard most persistently is that “the dean told us so.” I’m guessing then that self-preservation is among the most basic and fundamental motivations for program expansion. As the only department in the college without a PhD, we seem vulnerable to elimination, and we know that we have recently been short-listed in the administration’s budget-cutting considerations. The question is, would we be in any safer position as the newest, smallest, least credentialed PhD program on campus than we currently are without a PhD? If self-preservation is a major motivator, are there not a number of other strategies that might better secure our future? I think, for example, of creating a number of courses that focus on the central and essential role of religion in the world today. I recall that Walter Capps drew 1,000 students to his course on Vietnam at UCSB. It is clear that religion plays a decisive and central role in almost every area of conflict and concern in the world today. To create a series of courses taught to large numbers of students would, I would think, almost certainly assure our future.

E-media are inevitable innovations in the future university, and the humanities have been the last to appreciate their potential value. Yet it is entirely possible to do so. Our program in the mid-1990s called TheStrip was many years ahead of the times in e-media terms and yet has not been followed by anything else. One thing is quite clear: religion is a sensuous-rich aspect of life. Until now this aspect of religion has been relegated to coffee-table books and films. It is clear, even if we acknowledge several of Smith’s predictions, that incorporating experience, the senses, and so on will almost certainly be persistent areas of expansion in the study of religion over the next 40 years; these pair well with e-media and the e-delivery of courses. Aggressive innovations in this area would, I think, also assure the department’s security in the university as well as offer engaging creative challenges for faculty and students. These actions would not only more strongly assure our future, they would also contribute to the developing future of the study of religion and the university. For my taste and interest, I’d be most interested in seriously engaging in futurist studies of the university and the study of religion and taking bold actions as leaders and innovators in that future.

It is perhaps worth a moment to indicate some reservations I have about creating a more or less standard PhD program in religion at CU. Foremost, I strongly believe that the era of faculty simply cloning themselves in the next generation is over.

Next, I’m concerned about the faculty credentials for the program and the impact of this situation on it. Should we not count (and last spring I was painfully made aware that at the moment we are largely bean counters where research is concerned) the book publications of Rodney, Ira, and me (which total somewhere in the area of 25 to 30 books), the number of books published by the balance of the faculty is, to my knowledge, rather small (I know of 2, but there may be more). Can an active PhD faculty even begin to argue for high research stature and credibility when the book publication count is so low? Would any of our current faculty choose to attend a program whose faculty had so few publications?

Next concern: students. My sense is that a CU PhD program would draw large numbers of students. Basically, many of our current MA students who don’t get accepted to other PhD programs will want to continue at CU for a PhD. Frankly, it is easier to live on student loans than on unemployment, especially when you haven’t been employed. On the other hand, I think it would be rare indeed to attract to CU a top-notch student in any field. How could a CU religion PhD program possibly compete with long-established programs with highly accomplished faculty? Thus, our faculty will be working with potentially quite a number of students, but, my question is, are these the students you want to be working with at this level? Is this the best, most creative use of your time and energies?

Finally, I have very serious ethical issues about producing PhDs from a fledgling program in an already overcrowded academic field, knowing that their chances are small to none of getting an academic job of any kind. Placement records are important and public.

In my view, it is not appealing to be identified with and to experience the struggles of perhaps the weakest PhD program in the college and perhaps the weakest PhD program in the international arena of the study of religion.  I should think there are far more interesting, creative, innovative, engaging, fun (and even secure) ways of doing our jobs.

Writing is Gesture: Leroi-Gourhan

In his fascinating Gesture and Speech (1964, 1993) André Leroi-Gourhan traces the development of alphabetic writing, considering it a distant consequence of upright posture, which freed the hand and drew the focus of communication from the face to the techniques of the hand. The face and graphics are more strongly associated with mythology, while the eventual development of alphabetic characters arranged in a linear stream precipitated the emergence of rational thought and philosophy.

There is much of interest in this way of understanding writing and its emergence because it centers the importance of alphabetic writing in the body, particularly the hand, and it identifies alphabetic writing as gesture or technique. My reading of Leroi-Gourhan indicates something like a simultaneous development of writing gesture and the privilege of reason and thus philosophy, suggesting, importantly, that body, that technique, that gesture, is linked at least as much with agency as with expression. In the long history of human development we think as (not just what) we write, and vice versa. While I think Leroi-Gourhan understands that alphabetic writing and reasoned thought co-developed, my reading of him suggests that he understands writing in a more utilitarian fashion, as a tool of memory, rather than as a tool of creative and explorative thought. Clearly the evidence of what the earliest writing was about supports this understanding. However, this leaves untold the story of when and how writing came to be a creative, active, powerful, heuristic, imaginative process, surpassing and complementing the functional characteristics of recording and documenting.

As I am in the process of critiquing academic writing conventions, these observations of Leroi-Gourhan are important. It is clearly notable that a culture’s or community’s techniques of writing correlate with the way that community or culture thinks and engages the world. Writing conventions that prohibit the first-person pronoun, that proceed along a reasoned and factually supported argument from thesis to conclusion, inculcate and reflect a reasoned, linear, factual mind and a world that is similarly ordered and related to. These conventions correspond with a world that is orderly, firmly based on fact, and that can be fully comprehended by the proper use of reason and objective observation and description. These conventions correspond with a world where reason reigns and emotion, intuition, experience, and subjectivity are excluded as distracting. Yet Leroi-Gourhan shows that alphabetic writing is inseparable from the bodily posture of walking upright, inseparable from the hand gaining some role in communication otherwise centered in the face. Writing is, as Leroi-Gourhan shows us, gesture, a technique of culture and history. The conventions of specific forms of writing amount to unconsciously used gestures that instill value without ever articulating the value. The agency of the conventions is in the repetitious unquestioned practices of the body, the hand stringing alphabetic symbols across a blank page under specified constraints.

Leroi-Gourhan, writing in the 1960s, discussed the impact of audiovisual innovations as suggesting a shift in gestural practices that he believed might well spell the end of writing. He explicitly discussed film and audio recordings. He focused largely on mechanical production and reproduction, with hints of electronic media. Little could he have imagined the developments that have occurred in the last half century. What might Leroi-Gourhan have thought of the uncontrollable expansion of the Internet, the availability of 6 million songs on iTunes along with 10,000 music videos, 20,000 audiobooks, 65,000 podcasts, and 500 movies? What would he have made of YouTube with its 120 million videos (up from only 6 million 3 years ago; only, I say!), or Facebook with 500 million active members? These too are founded in gestures that demand greater analysis than we have yet given them as techniques of culture, especially the bodily implications of these media forms. Surely these new media mark, even more strongly than Leroi-Gourhan did, the end of writing, or at least the necessity of its radical shift. The interactive and relational character of the Internet engages gestures that defy the values insinuated by academic writing conventions that have changed little in a couple thousand years. And the extensive research that convincingly grounds mind, value, and meaning in gesture, body, sensorimotor patterns, and bodily movement offers an equally powerful challenge to these same conventions.

The future of writing surely lies in that aspect of writing that is a technique of creativity and imagination and exploration.  Until now we have relegated this technique to the sideline where we place art and entertainment; the challenge is how to embrace this technique while retaining some semblance of what we understand as academic.

Sam Gill – August 26, 2010

Education is Not Information

The future of education must carefully and critically question the current broadly held understanding that education is information. Late in the twentieth century, research began to demonstrate conclusively that meaning and value are based in and founded on bodily experience. The western Cartesian perspective separates mind from body (and from experience, subjectivity, and emotion). In many ways information is disembodying and body-denying. Even though we may know lots of facts and bits about something, this information does not a rich experience make. Information is information “about,” and aboutness signals object and distance. Even while advertisers and information providers identify very closely our personal individual information interests (most usually without our even knowing it is being done)—suggesting a subjective and experiential development—the information still stands apart from experience, from body.

This trajectory towards education as information marks, in my view, the greatest threat to education and thus to human value and the quality of human life. The difficulty in an electronic information age is how to link learning with experience, particularly bodily experience. While there are endless possibilities, a couple are worth a mention. The colony model suggested by Thomas Frey puts students, faculty, and professionals together as workers, developers, producers, investigators, and creators, contributing to society as they experience learning while actively pursuing practical goals: a film, a product, a service, new knowledge. Another possibility is the by-product of the scale of efficiency of e-learning. If most of the traditional time in classrooms and on campuses were spent e-learning, the time needed for learning would potentially be less than that required by more traditional learning methods. Time would be saved since the learning would be individually paced; once a student has learned a bit of knowledge she may move on. Time would be saved in the efficiency of having active learning tools available to students at any time and place, rather than being restricted to scheduled classrooms and class times. Thus, with fewer hours spent learning the same materials, and with facilities freed from exclusive, inefficient use for classes, this time and these spaces could be devoted to a wide range of bodily-based activities: intramural sports, yoga, dancing, fitness, and so forth. I’ll need to write before long on the types of brain/body activities that create the greatest potential for brain/body acuity. These need to be present in any learning environment.

A further option would be a transformation in the practice of writing. Currently university writing practices are consistent with an outdated, objectivist, linear understanding of learning. Current university writing conventions generally abhor any significant presence of the author, of subjectivity, of experience, of emotion. Given that it has become well established that experience, emotion, movement, sensorimotor patterning, gesture, and the like are fundamental to all meaning, future university writing conventions need to change. The challenge will be to create conventions and expectations in which academics—faculty and students—are writing the body, writing the moving body, writing experience, while continuing to be academic in the sense of creating and establishing generally applicable knowledge, principles, ideas, and concepts rather than simply personal trivia or even art. To rise to this challenge is an exciting prospect.

Sam Gill – August 26, 2010

How to Save the University: Lessons from the Possible Saving of the News

We are all well aware of the decline of the newspaper industry. A number of factors have contributed to this decline: the shift to on-line news sources; the decline of newspaper advertisers, who find other media more cost-effective; a shift to news packaged and presented as politics and entertainment; and the inefficiencies of news agencies based on a daily or regular paper medium whose news is necessarily no longer news by the time it is physically available. It is generally understood that Google and net-based services have contributed to the decline of newspapers. However, as discovered and reported by James Fallows in his The Atlantic article “Inside Google: The Company’s Daring Plan to Save the News (and Itself)” (June 2010), Google has initiated a range of actions in the direction of saving the news, if not the paper forms with which it has so long been identified.

I think that the recent decline of newspapers offers a lesson that might be applicable to the inevitable decline (though few are ready yet to acknowledge this) of university (and other levels of) campus-based education. Consider a few possibilities.

Let’s say that university faculty begin to package their courses as on-line courses and begin to shift their work from a single campus base of operation to a worldwide audience. Of course, on-line courses are already quite common and have been for some years. It is thus only an incremental step before on-line delivery becomes the principal method rather than a supplement to classroom delivery.

Let’s say that state legislatures continue to be financially stressed, as they have been now for some years. It doesn’t seem that the end of state government financial stress is yet in sight, and the structural changes presently being made are changing most universities in fundamental ways (ways that may well be beneficial).

Let’s say that family income available for higher education continues to be in short supply, so that an increasing percentage of families shop for lower-priced, yet still high-quality, education. There is currently an explosion in enrollment in community colleges, evidence that this trend is well under way.

Let’s say that studies begin to show that for most students, for many of the courses they take, the goals of education can be effectively met by on-line courses. If faculty are not limited to a single institution, the very finest faculty in the country or world in any subject could offer the bulk of the on-line courses. There is further advantage in that such courses are available to learners inexpensively and accessible at any time from any location. Perhaps these studies will show that students may benefit from one semester in residence for every two years of traditional campus-based learning, with the rest done effectively at home or while working or performing community service or even traveling.

Should students spend but 25% to 50% of their educational time on campus, the economies of operating physical university plants could be greatly reduced. Campus colonies (as Thomas Frey has suggested) might develop to use some of this unused space, enabling a learning-while-working community led by faculty and non-university professionals directed towards a work/learn environment that actually creates products, performs services, and so forth, incorporating learning as an essential dimension.

None of these possibilities is unlikely, and most are not only likely but would result in high-quality education at far less cost both to states and other education providers and to the families paying for it.

Whereas the newspaper industry has suffered decline and devastation, with the help of Google and others it may find a way to transition into a better, stronger news industry. Hopefully, universities and higher educators will initiate the creative transition to new models and practices. This can happen only if educators actually think toward the future and take action. I see little evidence that either is a present concern.

In this proposed model, some few students might still find a full four-year campus experience valuable. These would be the students who pursue the more classic liberal arts understanding of education, that is, education understood as cultivating a fully minded-bodied person rather than education as information or information processing.

In this model the economy of scale would afford faculty greater independence, autonomy, opportunity, and support for research in the more traditional sense of inquiry motivated primarily by curiosity and experimentation, rather than research motivated largely by industrial and defense demands.  This is the kind of research that can produce entirely unexpected results.

Sam Gill – August 26, 2010

Future of University – Writing Conventions

The Future of the University: Writing Conventions – August 24, 2010

It is somewhat confounding to me that while one of the most important images and charges of the university is that it represents the freedom to experiment and think, privileged and enabled by a separation from society, it appears in some respects to be among the most conservative and protective of societal institutions. Much of scientific research has already been given over to contract work for business and government. The humanities, however, seem to be in a plodding phase in which research is conducted on models established decades, if not centuries, ago. And teaching methods have changed little.

As Marshall McLuhan and others have shown, there is a strong interplay between medium and message. As futurists and educators evaluate the powerful shift toward electronic media, there is an accompanying discussion of the future of education. My readings of this discussion indicate that when electronic media are considered, everything in education turns to information. The assumption is that what e-media deliver is information. I suppose this is conditioned by the Google and Wikipedia and Facebook mentality that allows us to find out something about most anything we run across. I totally love this aspect of the Information Age. However, having information does not an education make. The university, in my experience, has over the last twenty years steadily drifted towards the understanding that education is information processing. This is not a healthy direction, for it substitutes an information portal for a school. Back to that in a later writing.

At the moment I want to think about the standard writing conventions that are expected in most universities. While I think there is a shift to simply using writing to “report” on information gathered, there remains that standard linear narrative: thesis, argument, conclusion. The disembodied posture (enough said) places this narrative as some object in the world that can and should stand alone. This academic writing convention is, in some sense, the hallmark of the university as it has long been. The accompanying qualities of these writing conventions—disinterested, disembodied, objective, unemotional—have far-reaching implications for us. While the university seems uninterested in even budging on its writing conventions, its own research findings have proven erroneous the understanding of human qualities so strongly ingrained in the university’s most cherished gesture, its writing conventions.

From my perspective, the reluctance to experiment with, reinvent, play with, and try out alternative writing conventions is, along with the limitations of furniture and architecture, the most oppressive and limiting feature of contemporary university humanities programs. Thomas Frey introduced an interesting observation that I’d like to extend beyond his use of it, as an analogy for how our writing conventions limit our research and our world. He noted that the ancient Romans were highly limited by their number system, Roman numerals. While this system works satisfactorily for tracking some things—keeping track of years is, I suppose, the main example, since we actually continue to do so—it clearly doesn’t work well for keeping one’s bank statement or for the calculations necessary for rocket science. While the Romans understood their number system as wholly adequate, they had no idea of the limitation this system imposed on their world. It is rather, as Frey suggests, like asking a fish to understand the limitations of the water in which it swims. I think also here of that fascinating book Flatland: A Romance of Many Dimensions, the 1884 satirical novella by the English schoolmaster Edwin Abbott Abbott, in which we understand, by analogy, the possible limitations of a three-dimensional world.
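To make the Roman numeral point concrete, here is a minimal sketch in Python (my illustration, not Frey’s): the notation is additive and subtractive rather than positional, has no zero, and so offers no digit-by-digit procedure for arithmetic; to compute at all, one must leave the notation for another representation.

```python
# A minimal illustration of the Roman numeral limitation discussed above.
# The system is additive/subtractive rather than positional and has no zero,
# so there is no column-wise algorithm for sums; we must convert to ordinary
# integers, compute, and translate the answer back.

ROMAN = [(1000, "M"), (900, "CM"), (500, "D"), (400, "CD"),
         (100, "C"), (90, "XC"), (50, "L"), (40, "XL"),
         (10, "X"), (9, "IX"), (5, "V"), (4, "IV"), (1, "I")]

def to_roman(n: int) -> str:
    if n <= 0:
        raise ValueError("Roman numerals have no zero or negatives")
    out = []
    for value, glyph in ROMAN:
        while n >= value:
            out.append(glyph)
            n -= value
    return "".join(out)

# Tracking years is easy: the year Flatland was published ...
print(to_roman(1884))        # MDCCCLXXXIV
# ... but there is no digit-by-digit way to add MDCCCLXXXIV and XLVII;
# the sum happens in the integers, outside the notation itself.
print(to_roman(1884 + 47))   # MCMXXXI
```

The asymmetry is the point of the analogy: writing a number down is easy, while computing with what is written requires stepping outside the system, a limitation invisible from within it.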

The analogy I’d make is that our current writing conventions, especially when their limitations and determinations are not even acknowledged, function similarly to the way the Roman numeral system did for the Romans: it limited their universe and experience in ways they couldn’t even detect. Academic writing conventions simply limit the world we are able to see and experience and investigate, and they do so in ways that we cannot even see or know. It follows then that what we need is a revolution in academic writing conventions. In the midst of this far-reaching electronic media revolution, this effort may well be the only way that university humanities programs will survive, that the idea that education is anything more than information processing may survive, that the very idea of the liberal arts educated human being might survive.

How can this revolution in writing conventions be achieved? How can a fish even “know” about “air,” much less appreciate its importance (the parallel to evolutionary history might well be of interest here)? I suppose there are a number of strategies (and I’ll want to write more before long on creativity, play, and hypothetic inference), but there are a couple of obvious directions. First, we might simply ignore the conventions while continuing to write and see what emerges. Second, we might take a Janus perspective and look back that we might see forward. Here we’d need to ask: What is writing? Why do we write? What does it matter whether we write or not? When did we start writing? How does writing relate to identity? To agency? Third, we may already have some hints, indeed some very strong indicators, about who we are as human beings and what actually matters to us: passions, feelings, emotions, movement. We may already understand that these qualities and values function even hidden behind the standard academic writing conventions. We may recognize that these qualities are inseparable from being embodied human beings and that being bodied is an essential aspect of our identity and agency.

“New writing,” if writing at all, must be shaped by who we understand ourselves to be as human beings, as academic beings, as valued human beings, and also by who we want to become, by how we want to impact the world in which we live, by how we want the world in which we live to look. “New writing” must be seen as agency and action, as creation, as personal, as powerful, not simply as some passive mechanism by which to connect information sources with information demands.

“New writing” is not, in the rapidly emerging e-world, some nice little experiment in cleverness initiated by idle academics bored at being removed from the real, productive world. “New writing” is, as I see it, an essential task to save the humanity in a world that is hell-bent on transforming everything into information and every being into an information processor. Not only do we lose something important in this process (like our humanity), we also truncate our human potential. We even lose the point of information: Why information? Through “new writing” we may realize ourselves; we may recreate ourselves to be powerful and creative in the new world that is arising around us.

As a teacher of writing I am at the point of admitting that I am a fish in water, but that, through analogy and imagination, I have begun to imagine that the water in which I swim may be depriving me of a kind of oxygen, a vitality, whereas I thought it was the source of life. I can imagine breaking through surfaces and finding new, now unimagined, vitality and richness, yet I am confined in some ways (how can I even know?) by the gestures and postures that comprise me. While I’ll do my best to realize “new writing” as fully as I can, I think of my students more as the tadpoles with legs who are about to rise to the surface and walk out onto new lands into new atmospheres. My students are the ones who have the greater potential to explore and create new, heretofore unimagined worlds through their creation of new writing systems. I’ll nip at their heels to force them forward.