
Wednesday, September 15, 2010

Empire Building on the Internet

So you want to build yourself an empire on the Internet.  You have a great idea for a new product, you scrape together some money, get a web site built, and open your first Internet storefront.  Time to conquer the competition.  Things start off well.  Your site makes a splash, people love your product, and they start coming in droves.  One problem: your site can't handle the load.

Waiting time for a queuing system (Brian Tung)
You've just encountered the harsh realities of queuing theory. Imagine you own a bakery and employ one clerk to staff the store.  The clerk is fairly efficient and takes about a minute to handle a customer.  Some customers take longer, because they have a special order.  If two customers arrive within 30 seconds of each other, the second clearly has to wait until the first is finished.  If customers keep arriving quickly, a line forms.  Queuing theory says that as the arrival rate approaches the capacity of the store (one customer per minute), delays grow rapidly and the average waiting time climbs toward infinity.  The remarkable thing is that this happens before the store actually reaches its capacity, because some customers take longer than a minute to serve, and because customers sometimes arrive sooner than the average.  The same situation holds whether you run a bakery, a web server, or a highway.  (See Brian Tung's excellent blog entry explaining why this happens for freeway traffic.)
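
To make the bakery concrete, here is a small sketch of my own (not from the original post) using the textbook M/M/1 queuing formula, which assumes random arrivals and exponentially distributed service times.  It shows how the average wait explodes as the arrival rate approaches the clerk's capacity of one customer per minute:

# A minimal sketch of M/M/1 queuing delay; the rates below are illustrative.
def average_wait_minutes(arrival_rate, service_rate):
    """Average time spent waiting in line: Wq = rho / (mu - lambda)."""
    if arrival_rate >= service_rate:
        return float("inf")                     # at or past capacity, the line grows forever
    rho = arrival_rate / service_rate           # how busy the clerk is
    return rho / (service_rate - arrival_rate)  # expected wait in the queue

service_rate = 1.0  # the clerk serves about 1 customer per minute
for arrival_rate in (0.5, 0.8, 0.9, 0.95, 0.99):
    wait = average_wait_minutes(arrival_rate, service_rate)
    print(f"{arrival_rate:.2f} customers/min -> about {wait:.0f} minute(s) of waiting")

At half capacity the wait is about a minute; at 95% of capacity it is already around 19 minutes, and it only climbs from there.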

Saturday, September 11, 2010

The Legacy of Manutius

I've blogged a lot about computer technology lately, discussing algorithmic thinking, programming languages, and metadata.   I want to take some time to tie these concepts together with the history we've been studying lately.

Aldus Manutius, the great Renaissance publisher, is well known for his preservation of Greek, Latin, and Italian texts, as well as his innovation in bringing these books to the general public in a small, portable format known as an octavo.  The modern analogue to his efforts is Project Gutenberg, which is digitizing as many books as it can and providing them for free to the public.  The latest count includes more than 33,000 free electronic books.  In many ways, this project is fulfilling Manutius' dream beyond his wildest expectations, due to the sheer volume of books being made available and the vast number of readers.  Of course, Manutius could not have foreseen the digital age, in which copies have become nearly free.  Nor could he have foreseen an era when volunteers would donate their time and resources to build such a large digital library.

Wednesday, September 8, 2010

Data and MetaData

bits, by sciascia on flickr
In some ways, the digital revolution is all about data -- the photos, videos, and web pages we view and share. Data is simply a series of zeros and ones, stored together in a file.  Each zero or one is a bit, and eight of these bits together form a byte.  The computer assigns meaning to each bit or byte, depending on the type of the file.  For example, in an image, a byte might represent one of 256 different colors for a pixel, or one dot in the image.  In other images, a pixel might be represented by 24 bits (three bytes), allowing nearly 17 million different colors for each pixel.  In a text file, each byte can represent a character; the ASCII system maps byte values to the letters, digits, and punctuation used in English.
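
To make this concrete, here is a small sketch of my own showing how the same raw bits take on different meanings depending on how a program interprets them:

# One byte (eight bits) interpreted two different ways -- an illustrative example.
byte_value = 0b01000001            # the bit pattern 01000001

# Interpreted as text, ASCII maps this value to a character.
print(chr(byte_value))             # -> A

# Interpreted as image data, three bytes form one 24-bit pixel:
# one byte each for red, green, and blue.
red, green, blue = 0xFF, 0x80, 0x00
pixel = (red << 16) | (green << 8) | blue
print(f"#{pixel:06X}")             # -> #FF8000, an orange pixel
print(256 ** 3)                    # -> 16777216 possible colors per pixel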

Campagna autunnale vicino Linguaglossa, by alfiogreen
Metadata is data about the data.  It describes what the data means, and it makes it easier for us and for computers to categorize the vast amounts of data we share on the Internet.  It's what makes data truly useful.  The photo sharing site Flickr, for example, allows users to tag photos with key words describing the image.  If I were looking for photos of Linguaglossa, the town in Sicily where my ancestors are from, a simple search would find many beautiful pictures.  In fact, your digital camera usually stores metadata about each photo right in the image file, describing the camera settings used to take the photo, the date, and other useful information.  Many web pages include metadata describing their content so that search engines can index them more accurately.
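
If you want to peek at that camera metadata yourself, here is a rough sketch using the Pillow imaging library (my choice for illustration; any EXIF reader would do, and "photo.jpg" is just a stand-in for one of your own pictures):

# A rough sketch of reading EXIF metadata from a photo with the Pillow library.
from PIL import Image
from PIL.ExifTags import TAGS

image = Image.open("photo.jpg")        # "photo.jpg" is a hypothetical example file
for tag_id, value in image.getexif().items():
    name = TAGS.get(tag_id, tag_id)    # translate numeric tag ids into readable names
    print(f"{name}: {value}")          # e.g. Make, Model, DateTime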

Digital libraries use metadata standards to markup resources for cataloging purposes.  For example, metadata for an author might look like this:
<name type="personal">
    <namePart>Bradbury, Ray</namePart>
    <role>
      <roleTerm type="text">creator</roleTerm>
    </role>
</name>
This uses a format called XML, which encodes metadata in a human-readable form.  In this case, we can see the author's name, Ray Bradbury, and his role as the creator of the work being cataloged.  XML plays a critical role in data exchange on the Internet; it allows data to be expressed in a format that describes its structure, so that computer programs can automatically process it in meaningful ways.  For example, the RSS and Atom formats are used by blogs to publish a list of posts, so that they are easily read by software such as Google Reader.  The RSS feed for the books added to Project Gutenberg can be found at http://www.gutenberg.org/feeds/today.rss.  You can see metadata in action on a digital library by searching the metadata for Project Gutenberg using the Anacleto search engine.
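
To see how a program can pull the author's name back out of that snippet, here is a short sketch of my own using Python's built-in XML parser (the element names match the example above; this is not tied to any particular digital-library toolkit):

# A minimal sketch: parsing the metadata record shown above.
import xml.etree.ElementTree as ET

record = """<name type="personal">
    <namePart>Bradbury, Ray</namePart>
    <role>
      <roleTerm type="text">creator</roleTerm>
    </role>
</name>"""

name = ET.fromstring(record)
print(name.findtext("namePart"))        # -> Bradbury, Ray
print(name.findtext("role/roleTerm"))   # -> creator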

The First Programming Language

Jacquard Loom Punch Cards, by Lars Olaussen
The first programs were written for a loom designed by Joseph Marie Jacquard (1752-1834).  The programs consisted of a series of holes punched into a card.  The machine reads each row of holes, corresponding to one row of thread in the design being woven, and uses the pattern to determine which hooks should be used for that row.  It's an ingenious system, and one that has been used for hundreds of years in the weaving industry.  Variations on these punched cards were used in early digital computers in the 20th century.
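
As a toy model of my own (not a faithful simulation of the actual loom mechanism), you can think of one card row as a list of yes/no answers, one per hook:

# A toy sketch of one punched-card row: a hole (1) raises the corresponding hook,
# a blank (0) leaves it down.  The pattern itself is made up for illustration.
card_row = [1, 0, 0, 1, 1, 0, 1, 0]

raised_hooks = [hook for hook, hole in enumerate(card_row) if hole == 1]
print("Raise hooks:", raised_hooks)     # -> Raise hooks: [0, 3, 4, 6]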

Punched cards such as these bear a close resemblance to what we today consider "machine code".  They are in essence a sequence of instructions in a language that can be directly executed by a machine.  They're not terribly easy for humans to understand.  Modern computer programming uses higher-level languages, which are then translated into machine code.  An example of a higher-level language is the Logo language I discussed earlier.

Wired Magazine has an interesting story on the Jacquard loom and its place in history.

Update: Here's a link to a BBC story I had been trying to find earlier about the lace industry in the UK and its use of Jacquard looms to this day.

Programming as Problem Solving

The programming process
I highly recommend the book Computers Ltd.: What They Really Can't Do as a good introduction to the concepts of computability and complexity from a non-technical perspective.  The figure at right is inspired by a diagram the book uses to introduce the concepts of programming and programming languages.

When a programmer has an idea for a new piece of software, the first step is to express this idea as an algorithm.  An algorithm is a method for solving a problem in a step-by-step fashion.  The next step is to implement the algorithm in a high-level programming language, a language designed to make it easy for a human to tell the computer what to do.  The programmer then uses a special program called a compiler to translate this language into machine code, a series of instructions the computer can execute directly.  The result is a software program that can be run on a machine, such as a laptop or a smart phone.
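
As a tiny illustration of those steps (my own example, not from the book), here is an algorithm for averaging a list of test scores, written first as plain steps and then in a high-level language; a compiler or interpreter then turns code like this into machine instructions:

# Algorithm:
#   1. Start a running total at zero.
#   2. Add each score to the total.
#   3. Divide the total by the number of scores.
def average(scores):
    total = 0
    for score in scores:
        total += score
    return total / len(scores)

print(average([88, 92, 79]))    # -> 86.33...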

From this description, you can see programming is a two-step process, of first designing an algorithm and then writing it in a programming language.  Algorithm design is usually the hardest part; computer scientists often refer to the need to train students to think algorithmically.  A good definition of what we mean by this is found in Developing Algorithmic Thinking with Alice, a paper by Cooper et al. that was published in the 2000 Information Systems Education Conference:
1) state a problem clearly, 2) break the problem down into a number of well-defined smaller problems, and 3) devise a step-by-step solution to solve each of the sub-tasks
This is an extremely important skill that transfers over into many disciplines, and the reason why we believe most students can benefit from taking at least one programming class sometime in their education.

Friday, September 3, 2010

Voyages of Discovery

A quick follow-up on my earlier post regarding programming and poetry.  An old friend, Jonas Karlsson, commented to me via Facebook:
Alas, we are still bound by the constraints of hardware, interface design, and computational complexity (and not to mention our own capabilities). We are not so much god-like creators, as explorers of the possibility-space.
This is a great counterpoint -- I went a little too far in stretching the analogy between programming and creation.  I think Jonas is right that programming is more like exploring.  Like Magellan or Columbus, we have limitations we must work around, so the space we explore is not infinite, though it appears exceedingly large.  There are so many things we can learn.

One of the ways computing explores the possibility-space is through simulation, in which a part of the world, or perhaps an imaginary world, is modeled and analyzed to answer questions about what might happen in a range of scenarios.  Simulations can be used to explore population genetics, predator-prey relationships, the weather, and many other important scientific and engineering topics. 

Shortly after the Deepwater Horizon oil spill, UCAR created a simulation showing how the oil could spread, based on models of ocean currents:



Did the Age of Exploration ever end, or did we just move on to new frontiers?

Elementary Programming

Since we'll be talking about computing so much, it might be helpful to have a basic understanding of how programming works.  A simple way to get a little exposure is to use a variant of the Logo programming language.  I'll show you a few programs using KTurtle, which is part of the KDE Education Project.

KTurtle supports a very limited subset of Logo known informally as turtle graphics.  The programmer controls a turtle with a pen.  The most basic commands tell the turtle to pick up the pen, put down the pen, move forward, and turn.  Believe it or not, you can draw some complex patterns with these simple commands.

Here's the first program:
repeat 5 {
    forward 100
    turnright 72
}
This tells the turtle to move forward 100 spaces and turn right 72°.  This sequence is repeated five times, resulting in this drawing:


Now put all of this in another repeat loop:

repeat 10 {
    repeat 5 {
        forward 100
        turnright 72
    }
    forward 1
    turnright 36
}
This draws the pentagon 10 times.  Each time the pentagon is drawn, the turtle moves forward 1 space and turns right 36°, resulting in this pattern:


Notice that if the product of the first repeat value and the last turnright value equals 360, the turtle will travel in a full circle as it completes the drawing.  The pattern above was done with factors of 10 and 36.  Here's what happens if we reverse the factors, turning 10° and repeating 36 times:


And here's the pattern that results from turning only 1° and repeating 360 times:


Looks a little like HAL 9000, doesn't it?

Now, what happens if you want the turtle to turn a different amount each time through a loop?  This is where you will need to use a variable to store the current amount you would like the turtle to turn:
repeat 4 {
    for $x = 1 to 100 {
        forward 10
        turnright $x
    }
    turnright 80
}
The for statement tells the turtle to set the variable $x to 1, move forward 10 steps, turn right $x°, and then repeat the loop with $x set to 2.  This continues for every value from 1 to 100.

These steps are themselves inside an outer repeat loop that repeats the drawing four times, turning right 80° after each iteration, to produce this figure:


You can do a lot more than this with most programming languages, but hopefully this gives you a general idea of what programming is like.  The KTurtle manual provides a complete description of the commands available.

If this looks like fun, you might try the Scratch software developed by MIT to provide a visual way of creating a much greater variety of programs.

Thursday, September 2, 2010

Programmer or Poet?

In his book The Mythical Man-Month, Fred Brooks shares this insight about programmers:
The programmer, like the poet, works only slightly removed from pure thought-stuff. He builds his castles in the air, from air, creating by exertion of the imagination. Few media of creation are so flexible, so easy to polish and rework, so readily capable of realizing grand conceptual structures.
Few would expect a man with a Ph.D. in math, best known for his work writing software at IBM, to wax so eloquently about the process of writing software.  Yet this quote captures precisely the lure of programming for many who practice its art -- the idea that we can build whatever world we want, following whatever rules we imagine, populated with creatures of our own design.  What a heady experience!  Writing a program provides a kind of thrill that is hard to replicate.  I built this.  It is all mine and I am its God.  An engineer who designs bridges is limited by the rules of nature, and must consider the stress on the structure and whether it is safe for people to drive on.  Engineers had better take into account the impact of wind on the stability of the bridge:



As my colleague Charles Knutson likes to say, computer scientists have no such limitations -- we can build a bridge that flies!

Leonardo da Vinci chose to create his own world when painting the Last Supper, one in which the apostles are engaged in a variety of activities rather than staring piously at the viewer.  He set the rules for perspective; he chose how the characters looked and acted.


Compare this to Salvador Dalí's surrealistic rendition, which creates an entirely different world of transparency and arithmetic.


Is a programmer more a scientist or an artist?

In a physical world that is made smaller through modern transportation and communication technologies, virtual worlds continue to offer voyages of discovery.

Computers are catholic and we live in the church of technology

Reading Daniel's post about artificial intelligence (and enjoying that clip from 2001: A Space Odyssey) made me think of how computers don't just puzzle us with conundrums about the nature of thought; they SCARE us. And you don't need that big red eyeball to get freaked out by our microchip masters. They frighten us not just because of what they can do, their feats of memory and manipulation, but because of the way we have become so dependent upon them. They are so integrated into our lives. They are so universal, so catholic. (The word "catholic" means "universal" and only later became a proper noun naming the Catholic church.)


we see through a glass, pixely

Computers are catholic (not Catholic), and that is a scary thing. I make this odd comparison because I want to link computing to the Renaissance and to the ways that those living in that time underwent a radical change in perspective. The culture of Western Europe in the late Middle Ages was completely dominated by the Catholic church. It was an institution that ordered their lives and their thinking. (This was not always the oppressive thing it is sometimes stereotyped as being; the Catholic church was a marvel of efficiency and sustained literacy over the centuries.)

Side story. I'm directing a play right now, and one of the actors is a producer for BYU Television, and she is looking for a family they can feature in one of those "let's take your technology away for a week and watch you all go into withdrawal" kinds of things. I told her how wired my family is, how my wife and I sit next to each other in bed tapping at our laptops and sending each other instant messages even though we are seated six inches from one another. That sort of thing. She started salivating, as if the Burton clan were perfect for one of these exposé-like features condemning the overlords of technology.

religion orders lives
computing orders our lives
Well, how would you do if the BYU TV people came to you and asked you to park your phones, your computers, and stay away from all screens for a week or more? That's right, you'd get the shakes and start foaming at the mouth. I'm doing that right now just thinking about such a media fast. (Part of me secretly wants to have that media fast, I confess...)

What's my point? Substitute our world-o-tech for the world of Catholicism at the end of the Middle Ages. Like that religious institution, computing and technology have become institutionalized in our lives. They give us order, routine, and an orientation to our time and our perspectives. I'm liking this comparison more and more. Computers and technology are the modern-day counterparts to medieval Catholicism. Heck, I think some people are more devoted to their cell phones than generations of Catholics were to their saints or rituals.

So, along come some people who challenge the basic premises of Catholicism. People like Copernicus and Galileo, who have the temerity to say that the map of the cosmos is drawn wrong. All wrong. Earth isn't in the center; the sun is. You just can't take away that Ptolemaic worldview from people cold turkey. It's like the teenager who text messages in her sleep or under the table. Take her cell phone away and what do you have? Someone who reads and knits for the poor? No, you have a freaked-out teenager who is going to snap at you like a junkyard dog.

I hope you see where I'm going with this. Today we are so immersed in change, in one new world being opened up to us after another, that we forget how hard it is to shift gears from a very stable, constant, and reliable worldview to another one.

Computing and tech are a worldview. We mostly like it, but like all worldviews, we don't sense how much it controls our thinking and feeling unless it is challenged.

So who are the Galileos of today? Is anyone challenging the status quo?

Andrew Keen is. And he is doing so in terms of another concept from the Renaissance: utopianism. He says Web 2.0 is a technological utopianism that just doesn't cut it. We've been trapped in the romance of the new tools while losing sight of what real knowledge and creativity are all about.

Why would anyone bother challenging our progressive computing world? What worldview is there to supplant or transcend it?

Photos: 1) flickr-rogiro; 2) flickr-riNux; 3) flickr-SimSullen

Wednesday, September 1, 2010

Artificial Intelligence and Search

We talked in class yesterday about Artificial Intelligence -- a branch of Computer Science that tries to create machines with human intelligence.  Here is the clip we watched from 2001: A Space Odyssey:



Notice how human HAL is in this clip.  He displays a passive-aggressive tendency, refusing to answer Dave's question for almost a full minute, as Dave gets increasingly impatient.  When HAL finally answers, he uses a calm voice as if nothing is wrong.  He tries to avoid conflict; when Dave asks him what the problem is, his response is one of my favorite lines in the movie: "I think you know what the problem is as well as I do."  HAL's last line of this clip is actually dismissive: "Dave, this conversation can serve no purpose any more."

One of the interesting aspects of the movie is whether HAL is displaying true intelligence or just doing what he was programmed to do.  HAL's programmed mission is to guide the ship to Jupiter and investigate the signal sent there by the monolith unearthed in the Moon's Tycho crater.  He believes the crew is jeopardizing the mission because they discuss disconnecting him after an error he committed, so he decides to kill them.  Is this evidence of intelligence, or just a logical outcome of his programming, based on the instructions he was given?  Computers are very good at doing what they are told to do.

This is the quandary of Artificial Intelligence -- is it even possible to make computers intelligent the way humans are?  Or will they only ever complete tasks that are "complex", acting as a mediator between humans and the world?  If a computer manages to learn something, is it only because the true intelligence is in the algorithm that humans embedded into the computer?  Can a computer adapt and learn things we haven't trained it for?  Can it go beyond its programming?

A good way of pondering these questions is to consider what computers can do now.  Consider Google.  We expect that we can enter any search terms we want on Google's site, and it will instantly find relevant web pages for us.  This seems intelligent.  Google is not just blindly returning the thousands of pages that match our query; it also has to make choices about which pages are most relevant.  Google is certainly doing something we would find difficult to do on our own, and it appears to be doing some high-level reasoning about which pages are best.
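
To make the idea of ranking a little more concrete, here is a toy sketch of my own: score each page by how often the query's words appear in it.  This is only an illustration of what "choosing the most relevant pages" means; it bears no resemblance to Google's actual algorithms:

# A toy relevance ranking -- purely illustrative, nothing like Google's real system.
pages = {
    "bakery.html":  "fresh bread and pastries from our neighborhood bakery",
    "loom.html":    "the jacquard loom wove patterns from punched cards",
    "queuing.html": "queuing theory predicts waiting time for customers in line",
}

def score(text, query):
    words = text.lower().split()
    return sum(words.count(term) for term in query.lower().split())

query = "punched cards loom"
ranked = sorted(pages, key=lambda page: score(pages[page], query), reverse=True)
print(ranked)                   # the loom page comes out on top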

Watch this video about how Google's search engine works:



Now what do you think?  Does seeing what makes Google work change your idea of whether it is intelligent?  Is it just following a prescribed set of steps that it was told to do?  Is the real intelligence in the people who thought up this brilliant idea?  Yet if we showed this technology to someone 20 or 30 years in the past, they would certainly be impressed by how far computer intelligence has come.  If Google doesn't yet meet your definition of intelligence, perhaps we are getting closer to cracking one of Computer Science's biggest challenges.  Maybe artificial intelligence isn't unattainable.

As you pursue your own learning in this class, keep HAL and Google in mind.  Would your learning be possible without computer technology?  Would it be less efficient?  At what point does computing become so powerful that the distinction between mediation and intelligence gets blurred?