Steve Jobs

(Photo courtesy of Kevin Berne)

In 2011, Steve Jobs’ death made news around the world. In the spring of 2012, his name was back in the headlines, this time in a theater production. Mike Daisey’s monologue “The Agony and the Ecstasy of Steve Jobs” was heavily criticized for its description of a Chinese factory that made Apple devices. The production, originally advertised as “nonfiction,” came under fire after many of its claims about working conditions in the factory were revealed to be exaggerated.

After some revamping and removal of certain embellishments, the controversial show is coming to the stage at 8 p.m. Sept. 13 at the Performing Arts Center.

The monologue intertwines descriptions of how Apple products are made in the factory in Shenzhen, China, with descriptions of the odd genius of Steve Jobs, going back and forth between startling facts about working in the Foxconn plant and how decisions made by Jobs affected these workers.

The controversy surrounding Daisey’s original production first emerged after Daisey read an excerpt of his monologue on the national radio program “This American Life.” The excerpt reported several details that were later proven to be false, including Daisey’s claim that girls as young as 12 were working in the factory and that the factory was patrolled by armed guards.

Though Daisey drew criticism for his embellishment of the conditions at Foxconn, the factory that assembles all things Apple, the rebooted version of the show has garnered praise since it opened this summer. Charles Isherwood of the New York Times wrote in his blog, “Version 2.0, in my view if anything, is more powerful, funny and engaging than the earlier production.” Isherwood also noted, “the details about the long hours worked and the spate of worker suicides at the Foxconn compound are still both disturbing and well-documented.” Reviewer Andrew Long of the Austin Chronicle wrote that it was a piece of theater absolutely worth seeing.

Despite its controversial past, Daisey’s monologue is particularly relevant to college students, who, as studies show, are increasingly using Apple products for studying and entertainment. According to an article published in 2010 by CNNMoney, 47 percent of college students use MacBooks.

“That statistic doesn’t really match with what I have seen,” business and Plan II student Diana Yang said. “Most of the people I know use MacBooks.”

Freshman Lindsay Richmond also said that Apple products have played an increasing role in her life. “I have a MacBook Pro, an iPhone, an iPod and my family has an iPad, not to mention the old iPods that I have had,” Richmond said. Yang owns around four Apple products as well.

The increasing dependence of society on products such as iPhones is a topic explored in “The Agony and the Ecstasy of Steve Jobs.” Daisey, a self-proclaimed “Apple fanboy,” says in the show, “I had never thought, in a dedicated way, about how they [Apple products] were made.”

“I think that it is important for consumers to know some information about how the products they use are made,” Yang said. Cindi Baldi, a teaching assistant for the class Organizational Corruption and Control, offered a more skeptical view.

“I’m all for the truth about [working conditions] at Apple coming out,” Baldi said. “However, I don’t think a monologue or something presented as theater is the right platform because even if the facts or stories presented have an element of truth, people are viewing it as entertainment and will subconsciously dismiss much of it as fiction.”

UT students will get the opportunity to learn a little more about Apple or a little more about entertainment, depending on their perspective, when Daisey performs Thursday through Saturday.

Do schools kill creativity? Ken Robinson, TED talks lecturer, international educational adviser and author of “The Element: How Finding Your Passion Changes Everything,” argues that they do.

His ultimate point is that “creativity now is as important in education as literacy and we should treat it with the same status.” He points to how schools all over the world champion languages and mathematics far above drama and the arts. “As children grow up, we start to educate them progressively from the waist up,” Robinson said. “And then we focus on their heads. And slightly to one side.” After childhood, studies become more focused on rigid academics and on the less creative side of the brain. Robinson’s analysis can easily be extended to the higher education system. And if it is true, it seems to offer an explanation as to why some of today’s most talented billionaires and geniuses, such as Bill Gates, Michael Dell, Mark Zuckerberg and Steve Jobs, all dropped out of college. Their universities did not offer opportunities to cultivate their creativity, fundamentally innovative ideas and new ways of thinking.

It’s easy to see how that statement is true at UT. Because of various restrictions, it’s almost impossible for students to take classes outside of their respective colleges unless they are required by the University’s core curriculum. It is even difficult for students to take classes outside of their majors but in their colleges because of restricted classes or rigid degree plans.

The current focus on four-year graduation rates does not make it seem likely that these opportunities will expand anytime soon. While an expedient path to graduation is a worthy goal for the University, we have to make sure that students are graduating with improved creative skills in addition to academic skills.

But how can students nurture their creative sides when they are forced to take so many classes to satisfy the core curriculum requirements? As it stands, students have neither the opportunity nor the time to take classes they simply find interesting.

What about the pre-med student who comes to college to discover a great passion and talent for music? How would he or she have time to pursue that?

Robinson tells the story of Gillian Lynne, who was diagnosed in the 1930s with a learning disorder because her school found her inability to sit still and learn disruptive to other students. Perhaps she had what we would now diagnose as attention deficit hyperactivity disorder. Though her school did not encourage her to pursue her talents, Lynne eventually went on to choreograph some of the best-known musicals in the world, such as “Cats” and “The Phantom of the Opera.”

What if students come to UT unaware of potentially great, creative talents they have? How would they go about finding them? To avoid squandering creative potential and to continue to foster more of it, the University should adopt an open curriculum such as those at Amherst College and Brown University. Both schools have no required coursework outside of specific majors. If that solution sounds too impractical, UT should at least make registration for courses in different colleges more flexible and incorporate more ways for students to make use of all the resources this campus has to offer.

It has been a little over a week since the death of Steve Jobs. In that time we have mourned the passing of a true genius, remembered his numerous and varied accomplishments and replayed clips of that 2005 Stanford commencement speech over and over again. The loss of the man is sad, but the loss of life-changing inventions coming from him is a tragedy. Apple fans everywhere are asking: what now?

How many students and professors walk to class every day with those characteristic white ear buds glued to their heads? How many people have followed the little blue dot on their iPhones’ GPS to get from the Drag to South Congress? How many students in a 300-person lecture class are typing notes, playing on Photo Booth or browsing Facebook on their MacBook Pros? Thousands of people would probably fight, kick and scream if someone tried to take their precious Apple products away, the same way they would if someone abducted their child or slowly sucked all the oxygen out of a room.

Luckily, our iPods, iPhones, iPads and MacBooks live on, a most fitting legacy to the man in the black turtleneck. But that is all they will do — stick around. We have grown accustomed to an exciting new apple-stamped machine hitting Best Buy at the beginning of every holiday season since 2001 when Apple introduced the first generation iPod. Furthermore, we have grown accustomed to buying whatever exciting new apple-stamped machine is hitting Best Buy, disregarding such mundane things as cost, practicality and need.

Did I need the iPad I got for my birthday last year? No, my HP desktop computer was perfectly capable of running Word and getting me onto Facebook, but it sure was cool to play Angry Birds on a 9.7-inch screen. We bought the first generation iPod touch when it was grossly overpriced at $400, just like we bought the camera-less first generation iPad even though it was widely said that consumers should wait for the faster, cooler iPad 2 with a camera. Even last week when the iPhone 4S was revealed, appearing just about identical to the iPhone 4, first-day pre-order sales topped a record-breaking one million. We have adopted every Apple progeny into our lives year after year, iThing after iThing, no questions asked. What happens if the stuff that Apple comes out with is no longer life changing? Can something with that iconic apple stamped on the back be uncool?

I find myself imagining the next 10 years of Apple releases and already being disappointed. Picture an Apple special event in 2021. Senior vice president Phil Schiller, who unveiled the iPhone 4S last week, looks sweaty. Could that be from the hot stage lights or the overpowering nerves? He works that stage back and forth like a pro during the presentation — or is he pacing the jitters out? Finally the moment comes. He has managed to build up the audience of media reporters and technology junkies to a state of tangibly excited anticipation. Maybe, just maybe, this new product will be that cutting-edge thing that Apple fans have been missing for the past decade of increasingly lame products. At last, the new iPhone flashes up onto the projector screen.

“Here it is, the new iPhone 10 – now in blue!”

We will probably buy it anyway because of some utterly irrational, deeply ingrained dependency on brand new Apple products. We consumers have not yet been able to resist Steve’s siren song for the latest and greatest iThing. Will his death mark the end of this decade of Apple frenzy?

Hansen is a Plan II and public relations freshman 

Printed on Friday, October 14, 2011 as: What are we to do in a world without Steve Jobs?

Last week, the world mourned the loss of Steve Jobs to pancreatic cancer. Unlike many other well-known figures, Jobs’ direct and indirect contributions to society are entirely tangible. He’s the reason the song “You’ve Got a Friend in Me” gets stuck in our heads and, consequently, the reason we can pull out a 32-gigabyte testament to human ingenuity to listen to it over and over again. Jobs was an innovator, a visionary and, of course, a college dropout.

That didn’t stop Reed College, the destination of Jobs’ semester-long postsecondary rendezvous, from unveiling a tribute to one of its “most visionary former students” on its website.

This kind of phenomenon takes place at UT as well. Last year, the Texas Exes — which, for that matter, does not limit its membership to alumni or even former UT attendees — revealed a list of Extraordinary Exes in celebration of its 125 years of existence. Longhorn legends such as Dell founder Michael Dell, broadcaster Walter Cronkite, businessman Red McCombs, NBA star Kevin Durant, Olympic gold medalist Mary Lou Retton, Charlie’s Angels icon Farrah Fawcett, former Texas Lt. Gov. Ben Barnes and former U.S. Speaker of the House Sam Rayburn all fall short of being traditional alumni.

And this illustrates higher education’s dropout paradox: that a university’s poster children of success may be the same poster children that critics point to when those individuals are reduced to a number or a percentage of the “did not graduate” persuasion. While their achievements may be boundless, they stand equally degree-less.

Some may point to the paradox as a way to illustrate the insignificance of a university education. After all, it seems as though college was simply a roadblock on their paths to greatness. Yet this assumption misses the well-documented influence universities had on many of the aforementioned dropouts’ successes.

Jobs, in his famous commencement speech at Stanford in 2005, talked about how after dropping out, he stayed at Reed for another 18 months to audit classes. Freed from the shackles of prerequisites, he took a calligraphy class that he credited as the reason the Macintosh would later revolutionize computing with “multiple typefaces and proportionally-spaced fonts.”

Dell and notable Harvard dropouts Bill Gates and Mark Zuckerberg launched their industry-transforming companies from their campus dorm rooms. According to his biography, “A Reporter’s Life,” Cronkite wrote for The Daily Texan and said his first time in front of a microphone was reciting sports scores for UT’s then-radio station, KTUT. Before becoming private investigator Jill Munroe for millions of ABC viewers in the late 1970s, Fawcett modeled for students and faculty at UT’s art department, which got her noticed by several publications. Durant led the Big 12 in points and rebounds in the 2006-07 basketball season that solidified his status as the second-overall pick in the 2007 NBA draft.

Though seemingly non-traditional, these situations simply illustrate what universities have always done best, which is to serve as resource centers for society. Universities serve as points of collaboration, boasting pockets of world-class expertise and resources in very specific areas.

However, what Texas’ recent higher education controversy has shown is the inherent difficulty in translating the intangible benefits of being a resource center into tangible, measurable outcomes. Having a premier conglomeration of top experts in the history of American foreign policy or housing the archives of David Foster Wallace is difficult to measure in dollars, cents and productivity hours.

This is at the root of the push to increase graduation rates. Institutions have significant administrative discretion to create policies that push students to graduate on time. Pledging to increase four- and six-year graduation rates is essentially an agreement between the University and the state that says, “We’ll promise to take care of this as long as you promise to leave us alone.”

The University’s real focus should be on finding avenues for students and the community to tap into and contribute to the institution’s rich resource centers. UT’s Intellectual Entrepreneurship Consortium is a leader in experimenting with creative programs to connect students to those resources, but it would require greater support to flourish. The Texas Center for Education Policy works to bridge the gap between community and academia but is more of an exception than the norm. Engagement initiatives like these would enhance and broaden the student experience at the University and better equip it for its mission of working for the betterment of society.

Jobs and his dropout colleagues listed above happened to tap into the university resources that changed their lives — as well as all of ours. Jobs finished his Stanford commencement speech by quoting the last words published in the Whole Earth Catalog: “stay hungry, stay foolish.” Students come into the university with both hunger and foolishness. Let’s not let that go to waste.

— Shabab Siddiqui for the editorial board. 

Steve Jobs, who co-founded Apple Inc. in 1976, died of pancreatic cancer on Wednesday, October 5.

Photo Credit: The Associated Press

Apple co-founder and visionary Steve Jobs died Wednesday, Oct. 5, of pancreatic cancer, Apple announced. Jobs stepped down from his role as CEO of Apple in August, and the newest iteration of the company’s popular iPhone, the iPhone 4S, was revealed yesterday by new CEO Tim Cook.

Jobs, who co-founded Apple with Steve Wozniak in 1976, was perhaps the most high-profile and influential celebrity CEO since John D. Rockefeller. After being fired in 1985, Jobs returned to the computer company in 1996 and ushered in a wave of advancements that would forever change how an entire generation of consumers would think about its relationship with technology and media.

In 2001, under the guidance of Jobs, Apple released the first-generation iPod. It was a thick, brick-like device that had a low-resolution black-and-white screen and five gigabytes of storage space. At the time, it was only compatible with Macintosh computers and retailed for $399.

Ten years later, the current iPod model, the fourth-generation iPod Touch, features a glossy touchscreen display, holds up to 64 gigabytes of data, records and plays back high-definition video and includes a front-facing camera for video conferencing over the Internet. The iPod line currently makes up 78 percent of the portable music player market share.

The speed at which new developments came from Apple under Jobs’ command helped create a culture of commerce that values immediacy. In addition to the nearly annual refreshment of its product lines, which include iPods, laptop and desktop computers, tablets and mobile phones, the launch of the iTunes Store in 2003 dramatically shaped how the entertainment industry entered the digital age.

More importantly, Jobs made the crucial distinction that entertainment and technology are inherently tied to each other by the Internet. The iTunes Store was a bold response to the pervasive digital piracy of the ’90s and early ’00s. Its massive success proved that consumers are more than willing to pay for digital content when the program is attractively designed and easy to use; iTunes is now the largest and highest-grossing music retailer in the world, with more than 16 billion downloads.

Design and ease of use became the guiding modus operandi for Apple under Jobs to reach great creative and financial success. The iPhone, perhaps Jobs’ greatest and most influential creation, has defined the mobile device marketplace since its release in 2007. Its sleek, intuitive design, user-friendly interface and unshakable cool factor have become the standard for consumer electronics.

But the largest reason for the iPhone and Apple’s success is Jobs’ careful construction of his company’s emotional narrative: he made computers and phones feel human. In Jobs’ keynote presentations and in the commercials and advertising for Apple products, the emphasis is placed on how the products foster intimate, almost poignant human connections.

In one of the ads for the iPhone — the first to feature the FaceTime video conferencing technology — a mother and her newborn child conference call with her husband, who is away for work; grandparents get to see their granddaughter’s graduation; and a couple are able to use the camera to speak to each other in sign language. Jobs blurred the distinction between living with technology and living through technology — an inspiring, effective touchstone of a brilliant career. 

Printed on October 6, 2011 as: Apple co-founder, innovator dies at 56

A message is displayed on the window of an Apple, Inc. store in Santa Monica, Calif., Wednesday, Oct. 5, 2011. Steve Jobs, the Apple founder and former CEO who invented and masterfully marketed ever-sleeker gadgets that transformed everyday technology, from the personal computer to the iPod and iPhone, has died. He was 56.

Photo Credit: Jae C. Hong | The Associated Press