by Keith Richardson
Wow! June already! Spring has sprung, election’s done, excuses spun, time for fun. Yes, I know, it’s always time for fun—whether we feel like it or not….
Eight years have flown by since this column first appeared in Today’s Senior. Much to reflect on. Gains, losses, friends made, friends who’ve passed on. Mostly very uplifting experiences.
Still, there are signs that, like all good things, MacSeniors, too, is coming to an end. “To you from failing hands” and all that. Of course, someone will come along (if [s]he hasn’t already) to replace us. We look forward to witnessing growth and development in their era.
Computers are associated with constant change. Gordon E. Moore, "visionary co-founder of Intel," observed back in 1965 that, since their inception in 1958, the number of components in integrated circuits (aka ICs, "microchips," or "silicon chips," not to be confused with Pringles explosion-triggering chips ;>)) had doubled every year, and he predicted that the trend would continue "for at least ten years." That observation evolved into "Moore's Law," which posits that the number of transistors on ICs doubles approximately every two years, and "the trend" is still happening, although we may be reaching the physical limits of such miniaturization.
A related rule of thumb, often lumped in with Moore's Law, holds that overall computer performance doubles roughly every 18 months. As processing power increases exponentially even while components shrink in size, the cost of owning one of these much faster, much more capable devices drops substantially, even as research and development costs spiral upward; economies of scale from rising sales keep prices down.
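For readers who enjoy the arithmetic, here is a back-of-the-envelope sketch of that doubling rule. The starting figure of roughly 2,300 transistors (the oft-quoted count for Intel's first microprocessor, back in 1971) is my own illustration, not something drawn from Moore's paper:

\[
N(t) = N_0 \cdot 2^{\,t/2}
\qquad\Longrightarrow\qquad
N(40) \approx 2300 \times 2^{20} \approx 2.4 \text{ billion}
\]

In plain words: a 1971 chip with about 2,300 transistors, doubled every two years for forty years, lands in the low billions, which is roughly where mainstream processors sat by the early 2010s.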
So are we about to reach the limits of all this growth (or shrinkage) any time soon? Like most predictions, answers depend on the assumptions one makes. One assumption we should NOT count on is that human ingenuity is about to tank. Though life as we know it MAY tank, women, men, and children, somewhere, as our cousins around the globe have done for millennia, will find, sooner than we blink, new means to revolutionize technology and human experience—whether old-timers like it or not. The gap will only grow between those who “get the future” and those who allow themselves to remain “mired in the moment”.
How can we fossils survive our times? I recently re-read a re-issued article from TechRepublic that quoted Louis L'Amour [paraphrased slightly]: "(S)he who ceases to learn is already half dead." The article continues: "Technology and organizations change, and with those changes comes the need to adapt accordingly. Once we cease learning, technology [and those continuing to learn] pass us by." It then offers "12 tips" to aid our learning, "regardless of subject." I've rearranged the order and reworded them to suit my purposes.
Set realistic goals. We have a greater chance of mastering material if we set goals—write them down; be realistic so we don't get discouraged, but force ourselves to "stretch" to achieve them. Find a personal connection.
Break things down. The way to eat an elephant, so the riddle goes, is to do so one bite at a time.
Don’t just read; take notes. Learn actively rather than passively. Frequency trumps duration in learning sessions.
Leverage what we know. If we’re trying to learn an apparently new concept, think about how that concept relates to something we already understand.
Retain the knowledge by applying what we’ve learned. Write or talk about what we’ve learned. Better yet, teach what we’ve learned—discover just how much we really comprehend.
Have someone review our insights.
Use an iPod. It plays more than just music. Lectures, for one thing. Listen while traveling or waiting in line.
And, to put all this into perspective, find and (re)read A Canticle for Leibowitz (© 1959)! Good luck on the rest of your journey.
Extra! Extra!
Meant to share this with you last month, but couldn’t fit it in. If you read the Vancouver Sun or related press, you may have seen an article by Paul Kendall of the London Daily Telegraph back in March that the Sun headlined “The digital world is changing our behaviour — for better and worse.”
Kendall claims: "Technology appears to be having an impact on development. One study shows that young people have skills their predecessors lacked, such as finding and filtering information, responding to stimuli and doing fast analysis." Talk to most grandparents (or parents, for that matter) and they'll tell you that young junior is just a whiz with computers. Kendall's analysis, however, goes far beyond computer literacy.
He asserts that "heavy multi-taskers have skills their predecessors lacked. They are adept at finding and filtering information, responding to stimuli and doing fast, incisive analysis. As 'digital natives' who have grown up with the Internet, they are used to technological change, while 'digital immigrants,' who grew up before the Internet, find it hard to keep up."
On the negative side, however, they tend to be “much more easily distracted by ‘irrelevant environmental stimuli’ and less able to maintain their concentration on a particular task.” Sound familiar?
He further cites a study at Stanford that "discovered students prefer to text a classmate down the hall in their dormitory rather than talk in person because it is 'less risky' and 'less awkward.' So they don't learn how to read facial expressions or navigate 'real world' social situations."
Is this also happening to some of the most tech-savvy seniors? What are the implications, we wonder, for inter-generational conversations? I know that my 11-year-old granddaughter, a lovely child, has told me that she feels bothered by phone calls or emails, but "it's okay to text me. I'll answer right away." Do you have similar experiences?
The Sun article goes on to explore "the navigational skills of 18-to-30-year-olds…without the aid of a satellite navigation device," the frequency and quality of handwritten messages, and the ability to remember phone numbers (and researchable data in general). I'll let you guess what the results show. You'll also find assertions that the "Internet encourages procrastination," that "we're becoming less empathetic" (because we don't read novels, we read superficial drivel), and that the Internet is feeding addictions.
There's more, much more. Certainly worth a look, some reflection, and discussion with intelligent members of younger generations. Just like the conversations we didn't have with our parents and grandparents, oh, about 50 or 60 years ago….
Find the article here:
http://www.vancouversun.com/entertainment/being+rewired/8083034/story.html