Thursday, September 30, 2010

Installing Intense Debate

One of the difficulties with blogging is keeping track of where you have commented and remembering to check back in case the author has replied.  One tool that helps manage this is Intense Debate.  This is a replacement for the default commenting system on your blog, and it includes some nice features.  You can have comment threading (so you can reply to a specific comment), get notified when someone replies to your comment, and integrate with Facebook and Twitter.  Here is how I installed Intense Debate on this blog:

John Locke and the Internet

Bill Cheswick -- Internet Mapping Project
To understand the Internet, let's look to John Locke, writing in Of Civil Government:
To understand political power correctly, and derive it from its origins, we must consider what state all men are naturally in: a state of perfect freedom to order their actions and dispose of their possessions as they see fit, within the bounds of the law of nature, without asking permission or depending upon the will of any other man.

Wednesday, September 29, 2010

Ten Ways to Bring Books Into Your Digital Life

Just because books are a legacy knowledge format doesn't mean you shouldn't make good use of them. The digital realm is reviving and repurposing books (just as those classical books from antiquity got a brand new life during the Renaissance when put into print). Books are more important than ever!

So, if you want to be taken seriously on your blog, show that you are good friends with books, that you read them, think about them, share them, and respond to them. It's so easy to do nowadays! Here are ten ways to bring books more powerfully into your digital life:

Agile Software Development as a Metaphor

An agile ninja

Software development has undergone a revolution in the last decade.  The formal software engineering method consists of the following steps:
  • Requirements: figuring out what the customer wants
  • Design: specifying on paper how the software should be built
  • Implementation: writing the software in a programming language
  • Testing: running tests to ensure the software acts as specified
  • Documentation: explaining how to use the software
  • Deployment: selling or installing the software
  • Maintenance: providing bug fixes as necessary
In the classic waterfall model, these steps are done one at a time.  Each step can be quite lengthy, and the final product is usually not ready until months have passed.  In particular, the requirements are considered to be almost a contract -- they cannot be changed once they have been formalized, because the design depends upon them.  Changing requirements means restarting the software design process.  The waterfall model is analogous to a samurai -- fighting is only allowed under strict rules that govern the fighter's code of conduct.

Monday, September 27, 2010

Nine Women Can't Make A Baby In One Month

This is a famous quote from Fred Brooks in his classic book, The Mythical Man Month.  Brooks uses this as a drastic example of a phenomenon that has come to be known as Brooks' Law:
Adding manpower to a late software project makes it later
Brooks coined this maxim after managing the development of IBM's OS/360 operating system in the 1960s.  He noticed that large software projects tended to run behind schedule, and that a manager's first instinct was to add more manpower.  After all, this works in many other fields.  For example, if you need to pick a crop of peaches, the more workers you employ, the faster it will get done.  John Steinbeck's The Grapes of Wrath describes how the perfect partitioning of this kind of manual labor during the Dust Bowl of the 1930s led to the exploitation of workers, movingly illustrating the ugly side of capitalism.  Brooks explains that software engineering tasks rarely partition this cleanly, so adding more workers actually delays a late project even more.  Each additional worker requires "ramp up" time to become integrated into the project, plus additional personnel and communication costs, factors that managers of the era were ignoring.
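Brooks's point about communication costs can be made concrete: among n workers, the number of pairwise communication channels grows as n(n-1)/2, a figure Brooks himself uses in the book. A quick sketch in Python:

```python
# Pairwise communication channels among n workers: n * (n - 1) / 2.
# Each new hire must potentially coordinate with everyone already on the team.
def channels(n):
    return n * (n - 1) // 2

for n in (2, 5, 10, 20):
    print(n, channels(n))
```

Doubling a team from 10 to 20 people quadruples the channels (45 to 190), which is one way to see why the extra hands don't translate into proportional progress.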

Saturday, September 25, 2010

Traditional vs. Digital Economies

One of the most profound changes in digital civilization is the emergence of economies that do not play by the traditional capitalistic rules that the West had taken for granted since the time of Adam Smith and the Industrial Revolution. I will briefly introduce and contrast chief principles of the market economy (drawing on Adam Smith) and those of the emerging economies in the digital world (by referring to several key concepts and their proponents).

Thursday, September 23, 2010

Digital Literacy Overview in Prezi Form

I'm glad to see many in our class playing with Prezi (as Andrew did recently). The visual nature of Prezi is far superior to PowerPoint. It is not tied down to sequence as PowerPoint is (one can pan around, zoom in and out, interrupt following a set path, etc.). It is more open to uses of space, movement, size, and orientation, too. For me, it isn't so much a presentation tool as it is a thinking tool. I reconceive of my topic while struggling to represent it using the formal features that Prezi favors.

Here's a first draft of a presentation about digital literacy that I put into Prezi form (for a presentation I'm giving to some English majors this week). What do you think? Part of what helped me here was recasting each of the three C's (consume-create-connect) into a set of questions. The visuals set up some parallelisms and symbolic relationships that I liked. I'm interested in your feedback before I take it further. Oh, and I've also set it up as public and licensed for re-use. Take a copy of it and revise, remix, or rework it on your own, if you like.



Wednesday, September 22, 2010

Science is Messy

In "The History of the Royal Society of London, for the Improving of Natural Knowledge", Thomas Sprat wrote:
The society has reduced its principal observations into one common-stock and laid them up in public registers to be nakedly transmitted to the next generation of men, and so from them to their successors. And as their purpose was to heap up a mixed mass of experiments, without digesting them into any perfect model, so to this end, they confined themselves to no order of subjects; and whatever they have recorded, they have done it, not as complete schemes of opinions, but as bare, unfinished histories.
What a fantastic metaphor for the open pursuit of science -- a mixed mass of experiments, unfinished work, heaped up into a common place to be nakedly transmitted from one generation to the next!

Monday, September 20, 2010

Algorithms and Truth

Fixing a broken lamp
In an earlier post, I introduced the concept of algorithms as a way of solving problems using a step-by-step process.  Most of us understand algorithmic thinking when it is applied to everyday tasks.  Fixing a broken lamp is a good example of an algorithm, because it involves taking different paths, depending on the outcome of several questions.  A recipe is a simple form of an algorithm:
Sausage and Peppers
1 package of Italian sausage (mild or hot, your choice)
2 bell peppers (red, orange, yellow, green, your choice), sliced
1 sweet onion, sliced
1 Tbsp minced garlic
fresh basil (or dried)
2 Tbsp olive oil
1 28 oz can whole tomatoes
1 package angel hair or penne pasta

1. Cook the sausage whole, on medium heat, until browned, using 1 Tbsp of olive oil. Alternatively, broil the sausage in the oven.  Slice into pieces and reserve.
2. In the same pan, cook the bell peppers and onion over medium heat using 1 Tbsp of olive oil, until the onion is lightly browned.
3. Add the garlic to the pan and simmer until slightly browned.
4. Add the sausage back to the pan.
5. Puree the tomatoes by pulsing, so that there are still small chunks.
6. Add the tomatoes to the pan, along with the basil.
7. Simmer the sauce on medium-low until reduced and thickened, about 10 or 15 minutes.
8. Serve over angel hair or penne pasta, cooked to package instructions for al dente.
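The lamp-repair example mentioned earlier can also be sketched in code, since its essence is branching on the outcome of questions; the particular questions and answers here are illustrative, not from any specific flowchart:

```python
# The classic "fix a broken lamp" algorithm as branching steps:
# each question's answer determines which path we take next.
def fix_lamp(plugged_in, bulb_burned_out):
    if not plugged_in:
        return "Plug in lamp"
    if bulb_burned_out:
        return "Replace bulb"
    return "Buy new lamp"

print(fix_lamp(plugged_in=False, bulb_burned_out=False))  # Plug in lamp
print(fix_lamp(plugged_in=True, bulb_burned_out=True))    # Replace bulb
```

The recipe, by contrast, is a straight-line algorithm: a fixed sequence of steps with no branches.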

Sunday, September 19, 2010

Ten Ways Out of the Google or Wikipedia Rut

Do you default to using Google or Wikipedia for just about every online search? Those are great resources, but not the only gateways to research and learning online. In fact, there may be more efficient or interesting ways into your subjects.

This post includes 10 great starting places for researching topics online without resorting either to Google or Wikipedia.

I'm going to walk you through each of these as I research the history of science and the scientific method. Watch how much fun I have in diversifying my online discovery methods. Hope you'll try some of these!

Saturday, September 18, 2010

Open Source Science

I attended an interesting talk by Daniel Lopresti on a new approach to machine perception at the BYU Computer Science colloquium on Thursday.  Machine perception refers to the ability of computers to mimic human behavior for tasks such as computer vision, document analysis, image processing, speech recognition, and natural language understanding.  Dr. Lopresti is advocating many approaches that we have discussed as part of the free software movement:
  • open, shared resources: the research community shares data, algorithms, citations, and other work
  • crowd intelligence: people can rate the quality of the resources, so that the community develops an interpretation of which are the best
  • transparency: algorithms and results are publicly available so they can be modified and improved by other researchers
As a side benefit, results are verifiable and repeatable.  Beginning researchers can build off of existing work more easily, instead of starting from scratch.

Essentially, this idea does away with the status quo of research in many fields, where each researcher works independently, rarely shares algorithms, doesn't always share data, and runs tests that are limited and not easily reproducible.

by NASA's Marshall Space Flight Center
Scientific research seems like the perfect match for openness and transparency. Science is often done for purely altruistic reasons -- to simply advance the truth and knowledge.  The complicating factors are that (1) corporations want to patent their research to monopolize it for themselves, and (2) academics want to keep their data and algorithms private for as long as possible, in order to publish more papers.  Open source science is a big dream, but we haven't yet figured out how to balance these concerns with the benefits that an open source approach would provide.

Wednesday, September 15, 2010

Empire Building on the Internet

So you want to build yourself an empire on the Internet.  You have a great idea for a new product, you scrape together some money, get a web site built, and open your first Internet storefront.  Time to conquer the competition.  Things start off well.  Your site makes a splash, people love your product, and they start coming in droves.  One problem: your site can't handle the load.

Waiting time for a queuing system (Brian Tung)
You've just encountered the harsh realities of queuing theory.  Imagine you own a bakery, and you employ one clerk to staff the store.  The clerk is fairly efficient and takes about a minute to handle a customer.  Some customers will take longer, because they have a special order.  If two customers arrive within 30 seconds of each other, clearly the second will have to wait until the first one is finished.  If customers keep arriving quickly, a line will form.  Queuing theory says that as the arrival rate approaches the capacity of the store (1 customer per minute), delays grow rapidly, and at capacity the average waiting time grows without bound.  The remarkable thing is that severe delays set in before the store reaches its capacity, because some customers take longer than one minute to serve, and because sometimes customers arrive sooner than the average.  The same situation holds whether you run a bakery, a web server, or a highway.  (See Brian Tung's excellent blog entry explaining why this happens for freeway traffic.)
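A rough sketch of why waiting time blows up, assuming the simplest textbook model of the bakery (Poisson arrivals, exponential service times, one clerk -- the M/M/1 queue), where the average time a customer spends in the system is 1/(μ − λ):

```python
# Average time in an M/M/1 queue: W = 1 / (mu - lam),
# where mu = service rate and lam = arrival rate (customers per minute).
def average_wait(arrival_rate, service_rate):
    if arrival_rate >= service_rate:
        return float("inf")  # the line grows without bound
    return 1.0 / (service_rate - arrival_rate)

service_rate = 1.0  # the clerk handles one customer per minute
for arrival_rate in (0.5, 0.9, 0.99, 1.0):
    print(arrival_rate, average_wait(arrival_rate, service_rate))
```

At half capacity the average wait is a pleasant 2 minutes; at 99% of capacity it is already 100 minutes, long before the arrival rate actually reaches the clerk's capacity.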

Social Bookmarking - The Diigo Digital Civilization Group

Of all the digital tools I have learned, I think social bookmarking to be among the most regularly useful. It has brought order to all the constant surfing and research I do online. It's not that hard to learn, and you'll end up getting more mileage out of your online work.

Dr. Zappala uses the del.icio.us bookmarking service, which is great. I have been using Diigo. We've decided to require students to learn and use Diigo because it is well suited for academic purposes. I'm glad that Kevin has posted about Diigo and has an account going already. We'd like you to get an account and then join the Digital Civilization bookmarking group I've set up on Diigo. Read on for more of an explanation and instructions on getting started.

Monday, September 13, 2010

Electronic Freedom

As a modern analogue to the Protestant Reformation, I would like to introduce several important organizations that fight for electronic freedoms.

The free software movement was started in 1983 by Richard Stallman with the foundation of the GNU Operating System project.  The goal was to create an operating system using only free software, where free is defined using four principles:
  • The freedom to run the program, for any purpose (freedom 0).
  • The freedom to study how the program works, and change it to make it do what you wish (freedom 1). Access to the source code is a precondition for this.
  • The freedom to redistribute copies so you can help your neighbor (freedom 2).
  • The freedom to distribute copies of your modified versions to others (freedom 3). By doing this you can give the whole community a chance to benefit from your changes. Access to the source code is a precondition for this.

Background for The Cathedral and the Bazaar

Eric Raymond
One of the assignments for this week is to listen to a speech by Eric Raymond on the subject of his famous essay, The Cathedral and the Bazaar.  It is unclear when he gave this speech, but it was possibly in 1997 at the Linux Kongress conference.

In this speech, Raymond talks about his insights into a programming philosophy he dubs "the bazaar".  His thoughts draw heavily from observations of the Linux operating system development process, which is coordinated by Linus Torvalds.  Linux is free software, written entirely by volunteers, and may be modified and redistributed by anyone.  While Linux has not had much of an impact on the desktop market, it runs about 20 to 40% of the servers on the Internet, and is increasingly being used to run cellular phones.  The Android phone system developed by Google is based on Linux, for example.

Saturday, September 11, 2010

The Legacy of Manutius

I've blogged a lot about computer technology lately, discussing algorithmic thinking, programming languages, and metadata.   I want to take some time to tie these concepts together with the history we've been studying lately.

Aldus Manutius, the great Renaissance publisher, is well known for his preservation of Greek, Latin, and Italian texts, as well as his innovation in bringing these books to the general public in a small, portable format known as an octavo.  The modern analogue to his efforts is Project Gutenberg, which is digitizing as many books as it can and providing them for free to the public.  The latest count includes more than 33,000 free electronic books.  In many ways, this project is fulfilling Manutius' dream beyond his wildest expectations, due to the sheer volume of books being made available and the vast number of readers.  Of course, Manutius could not have foreseen the digital age, when copies have become nearly free.  Nor may he have foreseen an era when volunteers would donate their time and resources to provide such a large digital library.

Renaissance, Reformation, and China

It's been such a pleasure to discover a series published by Oxford of very short introductions (to historical periods, famous people, and various -isms). Using my Amazon Prime account (which I love and students can get for free for a year) in two days and for $9 I had in my hands Jerry Brotton's The Renaissance: A Very Short Introduction. In 125 brief pages, it gives a great overview of this period.

The image here is Raphael's fresco of the Donation of Constantine. There's Constantine, formally conveying secular authority of the Roman Empire to the Catholic Church.  I spoke about that document in the Digital Civilization class -- how Lorenzo Valla discredited it through linguistic analysis and proved it to be of medieval origins. What I didn't know

Wednesday, September 8, 2010

Our Mormon Renaissance

Inspired by James Wilcox's post, "The Mormons are Renaissance Humanist" and Jeffrey Whitlock's "Humanism from a Latter-day Saint Perspective," I thought it was a fit occasion to make some parallels between the Renaissance and the predominant religion of those in this Digital Civilization course at BYU.

A few years ago I presented a paper at the Association for Mormon Letters called, "Our Mormon Renaissance." It has to do with early and ongoing aspirations of Latter-day Saints to achieve the cultural greatness largely identified with the fertile period of the Renaissance.  Hope you enjoy it.

Data and MetaData

bits, by sciascia on flickr
In some ways, the digital revolution is all about data -- the photos, videos, and web pages we view and share. Data is simply a series of zeros and ones, stored together in a file.  Each zero or one is a bit and eight of these bits together is a byte.  The computer assigns meaning to each bit or byte, depending on the type of the file.  For example, in an image, a byte might represent one of 256 different colors for a pixel or one dot in the image.  In other images, a pixel might be represented by 24 bits (three bytes), allowing for over 16 million different colors for each pixel.  In a text file, each byte can represent a character; the ASCII system maps each of the byte values to a character in the English language.
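These relationships between bits, bytes, colors, and characters are easy to check for yourself; a quick illustration in Python:

```python
# One byte = 8 bits, giving 2**8 possible values; a 24-bit pixel gives 2**24.
print(2 ** 8)             # 256 colors with one byte per pixel
print(2 ** 24)            # 16777216 colors with three bytes per pixel

# ASCII maps byte values to characters:
print(ord("A"))           # 65, the byte value for 'A'
print(chr(65))            # A
print(format(65, "08b"))  # 01000001, the eight bits actually stored
```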

Campagna autunnale vicino Linguaglossa, by alfiogreen
Metadata is data about the data.  It describes what the data means, and makes it easier for us and for computers to categorize the vast amounts of data we share on the Internet.  It's what makes data truly useful.  The photo sharing site Flickr, for example, allows users to tag photos with key words describing the image.  If I were looking for photos of Linguaglossa, the town in Sicily where my ancestors are from, a simple search would find many beautiful pictures.  In fact, your digital camera usually stores metadata about each photo right in the image, describing the camera settings used to take the photo, the date, and other useful information.  Many web pages include metadata describing the content so that search engines can index it more accurately.

Digital libraries use metadata standards to mark up resources for cataloging purposes.  For example, metadata for an author might look like this:
<name type="personal">
    <namePart>Bradbury, Ray</namePart>
    <role>
      <roleTerm type="text">creator</roleTerm>
    </role>
</name>
This uses a format called XML, which encodes metadata in a human-readable form.  In this case, we can see the author's name, Ray Bradbury, and his role as the creator of the work being cataloged.  XML plays a critical role in data exchange on the Internet; it allows data to be extracted in a format that describes its structure, so that computer programs can automatically translate it in meaningful ways.  For example, the RSS and Atom formats are used by blogs to publish a list of posts, so that they are easily read by software such as Google Reader.  The RSS format for the books added to Project Gutenberg can be found at http://www.gutenberg.org/feeds/today.rss.  You can see metadata in action on a digital library by searching the metadata for Project Gutenberg using the Anacleto search engine.
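Because XML describes its own structure, a program can pull these fields out mechanically. A minimal sketch using Python's standard library, parsing the author record shown above:

```python
import xml.etree.ElementTree as ET

xml_text = """
<name type="personal">
    <namePart>Bradbury, Ray</namePart>
    <role>
      <roleTerm type="text">creator</roleTerm>
    </role>
</name>
"""

name = ET.fromstring(xml_text)
print(name.get("type"))                 # personal
print(name.find("namePart").text)       # Bradbury, Ray
print(name.find("role/roleTerm").text)  # creator
```

This is exactly what feed readers do with RSS and Atom: walk the element tree and extract titles, dates, and links without a human ever looking at the raw markup.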

The First Programming Language

Jacquard Loom Punch Cards, by Lars Olaussen
The first programs were written for a loom designed by Joseph Marie Jacquard (1752-1834).  The programs consisted of a series of holes punched into a card.  The machine reads each row of holes, corresponding to one row of thread in the design being woven, and uses the pattern to determine which hooks should be used for that row.  It's an ingenious system, and one that has been used for hundreds of years in the weaving industry.  Variations on these punched cards were used in early digital computers in the 20th century.

Punched cards such as these bear a close resemblance to what we today consider "machine code".  They are in essence a sequence of instructions in a language that can be directly executed by a machine.  They're not terribly easy for humans to understand.  Modern computer programming uses higher-level languages, which are then translated into machine code.  An example of a higher-level language is the Logo language I discussed earlier.

Wired Magazine has an interesting story on the Jacquard loom and its place in history.

Update: Here's a link to a BBC story I had been trying to find earlier about the lace industry in the UK and their use of Jacquard looms to this day.

Programming as Problem Solving

The programming process
I highly recommend the book Computers Ltd.: What They Really Can't Do as a good introduction to the concepts of computability and complexity from a non-technical perspective.  The figure at right is inspired by a diagram the book uses to introduce the concepts of programming and programming languages.

When a programmer has an idea for developing a new piece of software, the first thing he does is express this idea as an algorithm.  An algorithm is a method for solving a problem in a step-by-step fashion.  The next step is to implement this algorithm in a high-level programming language; this is a language designed to make it easy for a human to tell the computer what he wants it to do.  The programmer then uses a special program called a compiler to translate this language into machine code, which is a series of instructions the computer can understand.  The result is a software program that can be run on a machine, such as a laptop or a smart phone.

From this description, you can see programming is a two-step process: first designing an algorithm, and then writing it in a programming language.  Algorithm design is usually the hardest part; computer scientists often refer to the need to train students to think algorithmically.  A good definition of what we mean by this is found in Developing Algorithmic Thinking with Alice, a paper by Cooper et al. that was published in the 2000 Information Systems Education Conference:
1) state a problem clearly, 2) break the problem down into a number of well-defined smaller problems, and 3) devise a step-by-step solution to solve each of the sub-tasks
This is an extremely important skill that transfers over into many disciplines, and the reason why we believe most students can benefit from taking at least one programming class sometime in their education.
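To make those three steps concrete, here is a toy problem of my own choosing (not from the paper): finding the largest number in a list. State the problem, break it into sub-tasks, then solve each one step by step:

```python
# Problem: find the largest value in a non-empty list.
# Sub-tasks: (1) keep a current best, (2) compare each remaining item
# against it, (3) return the best once every item has been examined.
def largest(values):
    best = values[0]
    for v in values[1:]:
        if v > best:
            best = v
    return best

print(largest([3, 7, 2, 9, 4]))  # 9
```

The algorithm (track a running best) is the real intellectual work; translating it into any particular language is comparatively mechanical.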

Finding Blogs to Read

Blogs are among the most important content sources today and finding and following appropriate blogs is a key part of the "Consume" portion of digital literacy. Here are some ideas for discovering blogs to follow. First, it's important to understand some major divisions among types of blogs.

There are "big" blogs (blogs associated with conventional, big media, like the New York Times blog, or blogs that are very popular and whose posts get tens or hundreds of comments). These big blogs are good to know about to stay current on things, but not as good with respect to communicating with others. They help you get the pulse on things, but they aren't strong with connecting you to communities or individuals. It is most often the smaller blogs where you can have personal interaction with the blog author. So look for those as much as for some of the big-content or top blogs.

Blog Search
This is an obvious starting point, but perhaps not always the most direct or efficient way of finding blogs to follow. But give it a try. The two most important blog searching services are Technorati and Google Blog Search.

Both of these services include mini, general directories to blogs you can browse. Technorati also has an authority ranking for blogs, which is a very interesting concept. Both services also list top blogs and categorize these. (And both index blogs generally, not just big ones.)


Top Blogs

Technorati maintains a well known list of top blogs, the "Technorati Top 100." This list has a heavy emphasis on news and politics (Huffington Post, CNN, Politics Daily, etc.) and especially technology blogs. Given the focus of the Digital Civilization course, some of the tech blogs are certainly worth following. I particularly recommend Read/Write/Web. Unlike some of the gadget-oriented blogs, it deals with larger principles and social trends, the effects of technology. Mashable is very current for Web2.0 and social media.

Blog Directories
Browse the categories in such blog directories as blogcatalog or blogged. There are also directories maintained by the major blog platforms. You might visit the Typepad Featured Blogs site. I find these directories have a more diverse set of categories (often more appropriate for academic purposes). If you are interested in connecting with LDS blogs (the "Bloggernacle"), you should consult the Mormon Archipelago or LDS Blogs directory.

Education Blogs
Especially appropriate for students are the many blogs hosted by teachers, educators, professors, students, and people in the ed-tech community. A good starting place for these is blogged, but the best is probably EduBlogs.

Institutional Blogs
Governments, universities, libraries, museums -- they are all getting into blogging as a way of communicating with constituents and featuring content. Try out the Library of Congress Blog, or any of the Smithsonian's blogs for starters.

Finding Blogs through "Social Discovery"
However, the best way I've found to discover blogs is indirectly, looking not so much for topics as for people, and by associative searching (by seeing who follows whom online). This is part of "social discovery," a key concept in digital literacy that we will be talking a lot about.

Essentially, what this means is that if you are involved in a social network (Facebook, Twitter, Goodreads, etc.) you click on your friends and followers and see if they have a profile or a link to their own personal blogs. Or, if you find a particular site or blog that fits your interests, and it has the followers feature enabled, you can usually find great blogs by seeing who else has shown enough interest in the topic of the blog to register as an official follower, then click through to see if they have a blog. This works great through Twitter, too. (See my brief video tutorial on this here).

If you are using Google Reader or Google Buzz, these also have follower features enabled and you can go see the profiles and websites of those that are following you. For example, take a look at my own Google profile. You can see the many places that I show up online. Indeed, almost any of the media content sites now have following/friending features and some sort of profile that can lead you to discover interesting people and their sites and blogs. I've found people through SlideShare, for example, a site devoted to posting and sharing PowerPoint presentations. You can do the same via Flickr (for photos) or Goodreads (for books).

Have you looked up the people that follow you on your blog? They just might be producing content you want to channel your way.

Sunday, September 5, 2010

Without Apology

Dr. Zappala and I have thrown a lot at our Digital Civilization class: new tools, and a radically new approach to content, independent learning, and connecting students to their own passions, to other students, and to things bigger and broader than a semester and a Gen Ed requirement.

But no apologies! Sure, it's a lot, with plenty of kinks to work out. We'll get some things wrong. But we know where we are going, and that direction is not backwards to the status quo of higher education.

Speaking of which, the following video expresses our sentiments. Only, unlike the professor at the beginning, we will not shrug and say we have failed our students. We are giving this experiment all we've got.




Too many teachers, courses, and colleges proceed apace as though no radical revolution is underway in our society. This is not our point of view. We will not be teaching this course in the traditional way with a little bit of audiovisual enhancement to dress it up. No. We are challenging our students in a serious way precisely because the stakes are so high and the attention given to the change is so low -- even at first-rate universities like the one we teach at.

It isn't just that technology is increasing and media multiplying. All our institutions are being reformulated -- not just retooled -- as the revolution takes hold in how we communicate, think, solve problems, collaborate, persuade, work -- in how we conceive of the world and act meaningfully within it: government, business, family life, art -- the works.

It may appear to our students that their professors are a couple of geeks imposing their love of things technical upon their students. If so, they have missed the point. It is not about the tools, nor some naive attachment to gadgets and science fiction. It is about the principles upon which society is built (or rebuilt); it is about a lifetime of purposeful, educated, passionate involvement in the life of the mind and the lives of our neighbors across the planet. It is about the very purpose of an education. It is about realizing how to realize your potential in a world in which print literacy will no longer dominate. It's about catching up, yes, but it is more about catching the vision.

We have that vision, and it thrills us. And it scares us a bit, to tell you the truth. But we want to take our students with us, forward in the future. We want them to be brave enough to detach from the comfort and familiarity of  textbook learning and to pick up the challenge to dig deep within to the taproots of their passions, and to reach beyond themselves and their classrooms to the social networks and authentic issues and problems to which they need not wait to begin contributing their talents.

For all of this, we don't apologize. We hope our students are just crazy enough to stay with us for the ride.

Saturday, September 4, 2010

Civilizing Us Digitally

Do our technological tools lead us toward or away from being civilized?

Daniel's recent post discussed simulations, through which parts of the world are modeled by computers in order to ask important questions about physical phenomena like weather or oil spills. In a way, this is how you can think about history or about literature. Stories (both the true and fictional sorts) are simulations. When we read Thomas More's Utopia, it models social phenomena, which might be as vital as those physically-oriented simulations Daniel mentioned.  Kevin Watson rightly pointed out that Utopia was not a place without problems (Andrew DeWitt noted that slavery was troublingly part of More's ideal place). Fine. The problems within a fictional utopia yield useful information for the real world. We can appreciate history and literature as a set of experiments from the past to color the present.

Our digital world is a richly experimental world, combining the data-driven world of science with various social sandboxes. Video games are often condemned as the end of civilization (or at least the end of reading). What about Sid Meier's famous computer game, Civilization? You start in 4000 B.C. and attempt to engineer a society to stand the test of time.


Video games are called antisocial, but if you are learning the various dynamic factors that influence the rise of nations across centuries, isn't that a kind of valuable knowledge? Maybe this sort of gaming could be the sort of "civic media" described in Dalton Haslam's recent blog post:
the use of media and information to help society function in the way that it should. It helps foster democratic ideals and leads to greater awareness as citizens
Media advancing democratic ideals, greasing the wheels of society. Sounds like technology and media are definitely tools for advancing civilization. It's true that advances in communications technology have generally meant greater participation from more people within their societies. This is where social idealism and technological utopianism combine: if we can only get a laptop for every child in the world, we'll soon have global democracy (and the end of poverty, etc.).

Studying a bit of Thomas More's Utopia is a good place to begin our thinking through digital civilization. As Mike Lemon points out in his recent post, the narrator of Utopia speaks admiringly of Utopians learning from the Greeks, yet the constant ironic tone puts this admiration in question. As Mike says, "Is he truly admiring the idea of information dissemination, or is he poking fun at the current trend of rediscovery?" Maybe Thomas More was playing with readers the way we might play with Sid Meier's game, Civilization.

Part of Renaissance thinking was devoted to idealizing (especially of ancient civilizations), and part of it was devoted to skepticism. That's why you get something like More's Utopia -- literally meaning a place that is "not" (u-topia) and a place that is good (eu-topia).  That's probably a healthy place to be during this Digital Renaissance of today.

Civilization itself is an ideal, going back to Plato's Republic or to St. Augustine's City of God. But in contrast to what? The root word of civilization is the Latin civis, "citizen" -- one who belongs to a city. Are cities inherently ideal -- as opposed to rural places? That's very problematic. As writers like Thoreau and Wordsworth (or environmentalists like John Muir, Rachel Carson, or Edward Abbey) have illustrated so well, our connection with earth, sky, and land may be more profoundly meaningful than the artifice and pollutions of city life. Are we aspiring to a dystopia by privileging "civilization"?

Nepalese women using One Laptop computers on the way to Mt. Everest base camp
And worse, by focusing on DIGITAL civilization, are we only distancing ourselves further from the authenticity of our physical environments and from friends and family? We've all known the scenario of technology sucking hours away, isolating us as we hover over screens, ignoring the flesh-and-blood people that are nearby. Is technology civilizing us or isolating us?

I've been interested by Kristen Cardon's musings about technology and education in her blog. She's studying Tibetans and technology, and she is asking some tough questions:
Why do we, along with Tibetans, see a holy grail in classroom technology? Given that some technologies actually improve the classroom, which ones detract?
I honestly don't know how to answer those hard questions. Even as we construct this class, Dr. Zappala and I anguish over whether we are focusing too much on means (digital tools) and not enough on ends (course content).

Let's up the ante even more. Maybe we're just gearing up for oppression through our technology in the classroom (either being oppressed or being oppressors). One of my recent students, Allison Frost, used her research blog to study the ways that China is truly ramping up in the digital age into Big Brother -- the overlords of George Orwell's 1984. At first I thought this was a simplistic analogy, but her research was very convincing. In the hands of those who wish to monitor and micromanage, technology is a powerful agency.

And if you look at how gambling, pornography, terrorism -- or even just spam, uncivil discourse, bullying, and widespread idiocy -- have spread rampantly through technology, it's easy to see digital civilization as an oxymoron. We've invented the means to amplify our own worst tendencies to the point of moral and physical self-destruction.

Photos: 1) flickr - graye; 2) flickr - One Laptop Per Child

Friday, September 3, 2010

Voyages of Discovery

A quick follow-up on my earlier post regarding programming and poetry.  An old friend, Jonas Karlsson, commented to me via Facebook:
Alas, we are still bound by the constraints of hardware, interface design, and computational complexity (and not to mention our own capabilities). We are not so much god-like creators, as explorers of the possibility-space.
This is a great counterpoint -- I went a little too far in stretching the analogy between programming and creation.  I think Jonas is right that programming is more like exploring.  Like Magellan or Columbus, we have limitations we must work around, so the space we explore is not infinite, though it appears exceedingly large.  There are so many things we can learn.

One of the ways computing explores the possibility-space is through simulation, in which a part of the world, or perhaps an imaginary world, is modeled and analyzed to answer questions about what might happen in a range of scenarios.  Simulations can be used to explore population genetics, predator-prey relationships, the weather, and many other important scientific and engineering topics. 
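To make the idea concrete, here is a minimal predator-prey simulation in Python. This is a sketch only: the equations are the classic Lotka-Volterra model, and the starting populations and rate constants are invented for illustration, not drawn from any real ecosystem.

```python
# A toy predator-prey simulation: the classic Lotka-Volterra equations,
# stepped forward with simple Euler integration. All parameter values
# here are invented for illustration.

def simulate(prey=10.0, predators=5.0, steps=2000, dt=0.01,
             a=1.1, b=0.4, c=0.4, d=0.1):
    """Return a list of (prey, predators) pairs over time."""
    history = [(prey, predators)]
    for _ in range(steps):
        # a: prey birth rate, b: predation rate,
        # c: predator death rate, d: predator growth per prey eaten
        dprey = (a * prey - b * prey * predators) * dt
        dpred = (d * prey * predators - c * predators) * dt
        prey += dprey
        predators += dpred
        history.append((prey, predators))
    return history

history = simulate()
prey_series = [p for p, _ in history]
print(f"prey ranged from {min(prey_series):.1f} to {max(prey_series):.1f}")
```

Even this toy shows the characteristic boom-and-bust cycles: prey decline while predators are abundant, predators then starve, and prey recover -- exactly the kind of "what might happen" question simulations are built to explore.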

Shortly after the Deepwater Horizon oil spill, UCAR created a simulation showing how the oil could spread, based on models of ocean currents:



Did the Age of Exploration ever end, or did we just move on to new frontiers?

Elementary Programming

Since we'll be talking about computing so much, it might be helpful to have a basic understanding of how programming works.  A simple way to get a little exposure is to use a variant of the Logo programming language.  I'll show you a few programs using KTurtle, which is part of the KDE Education Project.

KTurtle supports a very limited subset of Logo known informally as turtle graphics.  The programmer controls a turtle with a pen.  The most basic commands tell the turtle to pick up the pen, put down the pen, move forward, and turn.  Believe it or not, you can draw some complex patterns with these simple commands.

Here's the first program:
repeat 5 {
    forward 100
    turnright 72
}
This tells the turtle to move forward 100 spaces and turn right 72°.  This sequence is repeated five times, resulting in this drawing:


Now put all of this in another repeat loop:

repeat 10 {
    repeat 5 {
        forward 100
        turnright 72
    }
    forward 1
    turnright 36
}
This draws the pentagon 10 times.  Each time the pentagon is drawn, the turtle moves forward 1 space and turns right 36°, resulting in this pattern:


Notice that if the product of the first repeat value and the last turnright value equals 360, the turtle will move in a circle as it completes the drawing.  The pattern above was done with factors of 10 and 36.  Here's what happens if we reverse the factors, turning 10° and repeating 36 times:


And here's the pattern that results from turning only 1° and repeating 360 times:


Looks a little like HAL 9000, doesn't it?
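You can check that closure property without drawing at all, by tracking the turtle's position and heading numerically. Here is a sketch in Python (not KTurtle) that mimics the nested-loop program above. Each pentagon closes on itself, so the net motion is just the 1-step segments between pentagons; when (repeat count) × (turn angle) is a multiple of 360, those segments form a closed loop and the turtle ends up back where it started.

```python
import math

# Simulate the nested-loop turtle program numerically: `repeats`
# pentagons, each followed by a `step`-length move and a `turn`-degree
# turn. If repeats * turn is a multiple of 360, the turtle should end
# up back at its starting point.

def run(repeats, turn, side=100, step=1):
    x = y = heading = 0.0
    for _ in range(repeats):
        for _ in range(5):                      # one pentagon
            x += side * math.cos(math.radians(heading))
            y += side * math.sin(math.radians(heading))
            heading += 72                       # turnright 72
        x += step * math.cos(math.radians(heading))
        y += step * math.sin(math.radians(heading))
        heading += turn                         # turnright `turn`
    return x, y, heading % 360

x, y, heading = run(repeats=10, turn=36)
print("closes:", abs(x) < 1e-6 and abs(y) < 1e-6)   # prints: closes: True
```

Swapping in run(repeats=36, turn=10) or run(repeats=360, turn=1) also returns to the origin, matching the circular patterns shown above.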

Now, what happens if you want the turtle to turn a different amount each time through a loop?  This is where you will need to use a variable to store the current amount you would like the turtle to turn:
repeat 4 {
    for $x = 1 to 100 {
        forward 10
        turnright $x
    }
    turnright 80
}
The for statement tells the turtle to set the variable $x to 1, move forward 10 steps, and turn right $x°, then repeat the loop with $x set to 2.  This continues for every value from 1 to 100.

These steps are themselves inside of a repeat loop that repeats the drawing four times, turning right 80° after each iteration, to produce this figure:


You can do a lot more than this with most programming languages, but hopefully this gives you a general idea of what programming is like.  The KTurtle manual provides a complete description of the commands available.

If this looks like fun, you might try the Scratch software developed by MIT to provide a visual way of creating a much greater variety of programs.

Thursday, September 2, 2010

Programmer or Poet?

In his book The Mythical Man-Month, Fred Brooks shares this insight about programmers:
The programmer, like the poet, works only slightly removed from pure thought-stuff. He builds his castles in the air, from air, creating by exertion of the imagination. Few media of creation are so flexible, so easy to polish and rework, so readily capable of realizing grand conceptual structures.
Few would expect a man with a Ph.D. in math, best known for his work writing software at IBM, to wax so eloquently about the process of writing software.  Yet this quote captures precisely the lure of programming for many who practice its art -- the idea that we can build whatever world we want, following whatever rules we imagine, populated with creatures of our own design.  What a heady experience!  Writing a program provides a kind of thrill that is hard to replicate.  I built this.  It is all mine and I am its God.  Engineers who design bridges are limited by the laws of nature; they must consider the stress on the structure and whether it is safe for people to drive on.  They had better take into account the impact of wind on the stability of the bridge:



As my colleague Charles Knutson likes to say, computer scientists have no such limitations -- we can build a bridge that flies!

Leonardo da Vinci chose to create his own world when painting the Last Supper, one in which the apostles are engaged in a variety of activities, rather than staring piously at the viewer.  He set the rules for perspective, he chose how the characters looked and acted.


Compare this to Salvador Dali's surrealistic rendition, which creates an entirely different world of transparency and arithmetic.


Is a programmer more a scientist or an artist?

In a physical world that is made smaller through modern transportation and communication technologies, virtual worlds continue to offer voyages of discovery.

Computers are catholic and we live in the church of technology

As I was reading Daniel's post about artificial intelligence (and enjoying that clip from 2001: A Space Odyssey), it made me think of how computers don't just puzzle us with conundrums about the nature of thought; they SCARE us. And you don't need that big red eye-ball to get freaked out by our microchip masters. They frighten us not just because of what they can do, their feats of memory and manipulation, but because of the way we have become so dependent upon them. They are so integrated into our lives. They are so universal, so catholic. (The word "catholic" means "universal" and only later became a proper noun naming the Catholic church).


we see through a glass, pixely

Computers are catholic (not Catholic), and that is a scary thing. I make this odd comparison because I want to link computing to the Renaissance and to the ways that those living in that time underwent a radical change in perspective. The culture of Western Europe in the late middle ages was completely dominated by the Catholic church. It was an institution that ordered their lives and their thinking. (This was not always the oppressive force it is sometimes stereotyped as being. The Catholic church was a marvel of efficiency and sustained literacy over the centuries...)

Side story. I'm directing a play right now, and one of the actors is a producer for BYU Television, and she is looking for a family that they can feature in one of those "let's take your technology away for a week and watch you all go into withdrawal" kinds of things. I told her how wired my family is, how my wife and I sit next to each other in bed tapping at our laptops and sending each other instant messages even though we are seated six inches from one another. That sort of thing. She started salivating, like the Burton clan is perfect for one of these exposé-like features condemning the overlords of technology.

religion orders lives
computing orders our lives
Well, how would you do if the BYU TV people came to you and asked you to park your phones, your computers, and stay away from all screens for a week or more? That's right, you'd get the shakes and start foaming at the mouth. I'm doing that right now just thinking about such a media fast. (Part of me secretly wants to have that media fast, I confess...)

What's my point? Substitute our world-o-tech for the world of Catholicism at the end of the middle ages. Like that religious institution, computing and technology have become institutionalized in our lives. They give us order, routine, and an orientation to our time and our perspectives. I'm liking this comparison more and more. Computers and technology are the modern-day counterparts to medieval Catholicism. Heck, I think some people are more devoted to their cell phones than generations of Catholics were to their saints or rituals.

So, along come some people who challenge the basic premises of Catholicism. People like Copernicus and Galileo, who have the temerity to say that the map of the cosmos is drawn wrong. All wrong. Earth isn't in the center; the sun is. You just can't take away that Ptolemaic worldview from people cold turkey. It's like the teenager who text messages in her sleep or under the table. Take her cell phone away and what do you have? Someone who reads and knits for the poor? No, you have a freaked-out teenager who is going to snap at you like a junkyard dog.

I hope you see where I'm going with this. Today we are so immersed in change, and in one new world being opened up to us after another, that we forget how hard it is to shift gears from a very stable, constant, and reliable worldview to another one.

Computing and tech are a worldview. We mostly like it, but like all worldviews, we don't sense how much it controls our thinking and feeling unless it is challenged.

So who are the Galileos of today? Is anyone challenging the status quo?

Andrew Keen is. And he is doing so in terms of another concept from the Renaissance: utopianism. He says Web 2.0 is a technological utopianism that just doesn't cut it. We've been trapped in the romance of the new tools while losing sight of what real knowledge or creativity is all about.

Why would anyone bother challenging our progressive computing world? What worldview is there to supplant or transcend it?

Photos: 1) flickr-rogiro; 2) flickr-riNux; 3) flickr-SimSullen

Wednesday, September 1, 2010

Artificial Intelligence and Search

We talked in class yesterday about Artificial Intelligence -- a branch of Computer Science that tries to create machines with human intelligence.  Here is the clip we watched from 2001: A Space Odyssey:



Notice how human HAL is in this clip.  He displays a passive-aggressive tendency, refusing to answer Dave's question for almost a full minute, as Dave gets increasingly impatient.  When HAL finally answers, he uses a calm voice as if nothing is wrong.  He tries to avoid conflict; when Dave asks him what the problem is, his response is one of my favorite lines in the movie: "I think you know what the problem is as well as I do."  HAL's last line of this clip is actually dismissive: "Dave, this conversation can serve no purpose any more."

One of the interesting aspects of the movie is whether HAL is displaying true intelligence or if he's just doing what he was programmed to do.  HAL's programmed mission is to carry the crew to Jupiter and investigate the destination of a signal sent there by the monolith uncovered in the Moon's Tycho crater.  He believes the crew is jeopardizing the mission because they discuss disconnecting him due to an error he committed, so he decides to kill them.  Is this evidence of intelligence, or just a logical outcome of his programming, based on the instructions he was given?  Computers are very good at doing what they are told to do.

This is the quandary of Artificial Intelligence -- is it even possible to make computers intelligent the way humans are?  Or will they only ever complete tasks that are "complex", acting as a mediator between humans and the world?  If a computer manages to learn something, is it only because the true intelligence is in the algorithm that humans embedded into the computer?  Can a computer adapt and learn things we haven't trained it for?  Can it go beyond its programming?

A good way of pondering these questions is to consider the state of what computers can do now.  Consider Google.  We expect that we can enter any search terms we want on Google's site, and it will instantly find relevant web pages for us.  This seems intelligent.  Google is not just blindly returning the thousands of pages that match our query; it is also making choices about which pages are most relevant.  Google is certainly doing something we would find difficult to do on our own, and it appears to be doing some high-level reasoning about which pages are best.
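As a rough illustration -- and emphatically not Google's actual algorithm, which combines link analysis (PageRank) with many other signals -- here is a toy ranker in Python. The pages and their words are made up; the scoring is a bare-bones tf-idf, weighting each query term by how often it appears in a page and how rare it is across pages.

```python
import math

# Three made-up "pages", each reduced to a bag of words.
pages = {
    "turtle-graphics": "logo turtle graphics pen forward turn",
    "oil-spill": "simulation oil spill ocean currents model",
    "ai-search": "google search engine ranking relevant pages",
}

def score(words, query, corpus):
    """Sum tf * idf over the query terms (a bare-bones relevance score)."""
    total = 0.0
    for term in query.split():
        tf = words.count(term)                            # term frequency
        docs_with = sum(term in doc.split() for doc in corpus.values())
        if tf and docs_with:
            # rare terms get a bigger weight; ubiquitous terms get zero
            total += tf * math.log(len(corpus) / docs_with)
    return total

def search(query):
    """Return page names, best match first."""
    return sorted(pages,
                  key=lambda name: score(pages[name].split(), query, pages),
                  reverse=True)

print(search("search ranking"))   # 'ai-search' comes first
```

The point of the sketch is that "choosing the most relevant pages" reduces to computing a number for each page and sorting -- a prescribed set of steps, however intelligent the result may feel.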

Watch this video about how Google's search engine works:



Now what do you think?  Does seeing what makes Google work change your idea of whether it is intelligent?  Is it just following a prescribed set of steps that it was told to do?  Is the real intelligence in the people who thought up this brilliant idea?  Yet, if we showed this technology to someone 20 or 30 years in the past, they would certainly be impressed by how far computer intelligence has come.  If Google doesn't yet meet your definition of intelligence, perhaps we are getting closer to cracking one of Computer Science's biggest challenges.  Maybe artificial intelligence isn't unattainable.

As you pursue your own learning in this class, keep HAL and Google in mind.  Would your learning be possible without computer technology?  Would it be less efficient?  At what point does computing become so powerful that the distinction between mediation and intelligence gets blurred?