I remember when I got my first cell phone in the seventh grade. It was a Nokia 5110.
The device was a revelation. Before I got it, I had little option other than to twiddle my thumbs if I was bored on a road trip or waiting for a doctor’s appointment. Now, I could apply those finicky appendages to the dial pad and guide a pixelated snake on a feeding mission through a little square box.
I could call my friends from anywhere. No longer was I tethered to a phone wired to the wall. My father shunned the device. He couldn’t understand why someone would want to be reachable by phone at all times.
He felt the same way about computers in the home. We bought our first computer years after my friends’ families jumped on the Windows bandwagon.
In high school, I upgraded to a flip phone. It featured a beautiful color display, an expanded set of games and an awesome new technology called text messaging. For just 10 cents per text, you could send messages straight to your friends’ phones. It was incredible.
My father thought otherwise, unable to understand how I could justify spending a dime on a text when I could call my friends for free.
I told him that he was a victim of a generational divide bred by the invention of the Internet, and that his refusal to adapt to new technologies stemmed from the fact that he and other baby boomers grew up in a world without connected devices.
I felt bad for him, but found personal solace in the idea that I would never suffer the same fate. I was fortunate enough to grow up with technology being an integral component of my life. I would always keep up with the latest trends.
At least, that’s what I thought. Now, I find myself in an endless struggle just to keep pace.
What is this Snapchat thing and why is it getting so many teenagers in trouble? Why would someone make the conscious decision to look like a cyborg while wearing Google Glass? Who needs a watch that’s connected to the Internet?
In raising these questions, I had an epiphany. It took 40 years for my father to be cast from the mainstream. At 25, I teeter on the precipice of succumbing to the same fate.
My struggle to keep up is not unique. In fact, I would say that I do a better job of incorporating new trends into my life than most people in my age group. But it’s getting harder and harder to stay relevant.
With this in mind, what does the future hold if generations are becoming more and more segmented?
In 2001, Ray Kurzweil articulated his vision for the future in his theory called The Law of Accelerating Returns.
Kurzweil, a prominent inventor, futurist and Director of Engineering at Google, argues that the rate at which new technologies are developed and adopted will continue to accelerate exponentially over the coming century.
“An analysis of the history of technology shows that technological change is exponential, contrary to the common-sense 'intuitive linear' view. So we won't experience 100 years of progress in the 21st century—it will be more like 20,000 years of progress (at today's rate),” says Kurzweil.
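To see where a figure of that magnitude comes from, consider a toy model in which the rate of progress doubles every decade. This is a simplified sketch of one reading of the Law of Accelerating Returns, not Kurzweil’s own calculation; the doubling period and starting rate are illustrative assumptions.

```python
# Toy model: progress measured in "years of progress at today's rate."
# Assumption (not Kurzweil's exact model): the rate doubles each decade.

def century_progress(doubling_period_years=10, horizon_years=100):
    """Sum progress period by period, with each period delivering
    twice as much progress-equivalent as the one before it."""
    total = 0.0
    rate = 1.0  # one year of progress per calendar year, today
    for _ in range(0, horizon_years, doubling_period_years):
        total += rate * doubling_period_years
        rate *= 2
    return total

print(century_progress())  # prints 10230.0
```

Even this conservative discrete version yields on the order of 10,000 years of progress-equivalent in a single century, the same order of magnitude as Kurzweil’s 20,000-year figure.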
While the exponential nature of technological development opens new avenues for innovation, medical discovery and greater global interconnectivity, it also presents a serious set of problems.
Primarily, mass adoption of new technology and trends lags behind innovation. As the innovation cycle continues to accelerate, consumers must adapt to new ideas and technology as they spread throughout our culture more quickly than ever before.
This has serious social and economic implications when you consider the innovation adoption cycle.
According to Everett Rogers’ diffusion of innovations model, only 16 percent of the population are considered early adopters of a new trend or technology. Conversely, half of the population resists adopting new trends until near the end of the cycle.
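The arithmetic behind those two figures comes straight from Rogers’ standard adopter categories, which can be tallied in a few lines:

```python
# Everett Rogers' standard adopter categories, as percentages
# of the population (diffusion of innovations model).
rogers = {
    "innovators": 2.5,
    "early adopters": 13.5,
    "early majority": 34.0,
    "late majority": 34.0,
    "laggards": 16.0,
}

early = rogers["innovators"] + rogers["early adopters"]
late = rogers["late majority"] + rogers["laggards"]

print(early)  # prints 16.0 -- the early-adopting sliver
print(late)   # prints 50.0 -- half the population adopts late
```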
As the timeframe for adapting to new innovations becomes shorter and shorter, those who are late to the party risk being left behind altogether.
The problem is that staying current on digital and societal trends requires more than sheer will. You have to have the means and access in order to invest in them. Simply stated, it’s an expensive proposition. But in many ways, the investment is absolutely essential.
In analyzing the impact that the emergence of the Internet has had on contemporary commerce, it becomes clear how important it is to cultivate technological literacy.
Remarkably, even in the United States, huge segments of the population do not possess the technological skill sets needed to compete in the current job market.
A report commissioned by the US Department of Commerce underscores the urgent need to address this problem.
The report begins by saying that “researchers and policymakers recognize that availability and use of high-speed Internet services – a range of connection technologies collectively known as broadband – are essential to economic growth.”
However, the report goes on to say that 60 million Americans don’t use the Internet.
According to the Washington Post’s analysis of the report, “74 percent of urban households use the Internet, versus 62 percent of rural households. That gap gets even worse when it comes to broadband adoption -- 72 (urban) versus 58 (rural) percent. Non-users tend to live in the southeast.”
The Commerce Department challenges the notion that some people simply don’t need the Internet, finding that access “has a measurable impact on employment, income, consumer welfare and civic engagement.”
Still, many either choose to shun new technologies or aren’t provided with the means to access them.
“We recognize more work needs to be done to ensure that no Americans are left behind,” said John B. Morris Jr., director of Internet policy at the National Telecommunications and Information Administration, the division of the Commerce Department that authored the report. “Increasing the level of broadband adoption is a complex, multifaceted challenge with no simple, one-size-fits-all solution.”
Clearly, there are significant hurdles to ensuring that everyone can remain actively engaged in the modern economy. And many of those hurdles can be found in our education system.
Economically depressed communities and school systems have limited access to the digital technologies they require in order to sufficiently prepare their youth for the modern job market.
According to the Pew Research Internet Project, “56 percent of teachers of the lowest income students say that a lack of resources among students to access digital technologies is a ‘major challenge’ to incorporating more digital tools into their teaching; 21 percent of teachers of the highest income students report that problem.”
This is disturbing because many young people entering the job market haven’t learned the skills needed to maintain a competitive edge.
The Obama Administration has sought to address this problem by pouring more than $7 billion into expanding Americans’ access to high-speed broadband Internet. Still, many cite a lack of digital literacy skills as the reason they’re falling behind.
“The job I’m trying to get now requires me to know how to operate a computer,” said Elmer Griffin in an interview with The New York Times. Griffin, a 70-year-old retired truck driver from Bessemer, Ala., was rejected for a job at an auto-parts store because he couldn’t use a computer. “I wish I knew how, I really do. People don’t even want to talk to you if you don’t know how to use the Internet,” he lamented.
Administration officials and policy experts are troubled by the 60 million Americans who are already shut out from an economy that’s becoming more digitized.
These individuals don’t have access to jobs, services and healthcare opportunities that are increasingly turning to electronic platforms.
With a significant segment of the education system failing to account for this trend, we now face the prospect of alienating the young and old alike in the modern economy.
This effect is even more startling when you analyze it in a cultural context. I found myself bewildered after reading a Salon article titled “Too poor for pop culture.” The article shed light on what life is like for impoverished people living in East Baltimore, a community that all too well encapsulates this disturbing trend.
D. Watkins, the article’s author, tells the story of a conversation he had with some friends from his Baltimore neighborhood:
“A yo, Michelle was gonna beat on Barack for taking dat selfie with dat chick at the Mandela wake! Whateva da fuk a selfie is! What’s a selfie, some type of bailout?” yelled Dontay from the kitchen, dumping Utz chips into a cracked flowery bowl. I was placing cubes into all of our cups and equally distributing the vodka like, “Some for you and some for you …” “What the fuck is a selfie?” said Miss Sheryl. “When a stupid person with a smartphone flicks themselves and looks at it,” I said to the room. She replied with a raised eyebrow, “Oh?”
“It’s amazing how the news seems so instant to most from my generation with our iPhones, Wi-Fi, tablets and iPads, but actually it isn’t. The idea of information being class-based as well became evident to me when I watched my friends talk about a weeks-old story as if it happened yesterday.”
While this anecdote might sound shocking, we are swiftly approaching a new reality in which this kind of cultural disparity between classes is more widely experienced. It used to be that 40-year-olds and teens couldn’t understand one another. Looming now is the prospect that 20-somethings won’t even be able to relate to their own contemporaries.
It’s amazing that, at 25, I find myself yearning for a simpler time when the game Snake was flashy and cool. I wonder how I’ll feel when I’m 30.
Photo credit: Getty Images