Section Eight: Science, Non-Fiction- #ausdigitalfuture

by Lori Dwyer on June 20, 2012 · 7 comments

“Never trust anything that can think for itself if you can’t see where it keeps its brain.”
Arthur Weasley, Harry Potter and the Philosopher’s Stone.


Almost a year ago, I transferred my entire CD collection to iTunes… which proceeded to wipe every single song when I upgraded to iOS 5.

I spent two weeks quite literally grieving for eighteen years’ worth of music– although my CD purchasing had dwindled over the last few years and came to a complete standstill once I initiated myself as an iPerson; the last CD I bought was Pink’s Funhouse, which wasn’t really released that long ago.

Then I found, while I wasn’t even looking for it, the box I had accidentally neglected to throw out– the one containing my entire CD collection, some in cases with pristine booklets, others rolling loose or hanging just barely connected to their brittle plastic covers.

I have never been so glad to see a pile of scratched, skipping CDs in my entire life; and probably never will be again.


According to Mr Phil Ruthven, there is a time capsule buried on the grounds of a certain Australian university that, if everything goes to plan, will be retrieved in the year 2054. One of the objects our unborn relatives will find is a simple letter opener.

Imagine the confusion it will cause…

Unless, of course, objects such as that are archived somewhere right here, online, in the endless realms of the Internet.


The IBM Snapshot of Australia’s Digital Future was a mind-blowing event, in the almost literal sense of the word– I felt my sense of normality being stretched like spun sugar, encompassing the positive vision of our future that the panelists (sadly, all men) projected. I also had a head-spinning moment of “Who let the mummy blogger in here?!”, but that’s, you know, my issues.

I won’t bore you with the GDPs and figures that go over my head (except to say that Australia’s mining boom will be over by 2020), but it seems that here online we’re safe– the industries that will die off in the next 50 years, aptly christened ‘dead men walking’ on Twitter, include traditional media such as print newspapers, books and magazines; radio broadcasting; computer software production; and various entertainment sectors such as cinemas and movie rentals.

All these industries are deemed nonviable against the ‘infotronic’ age and, unless they find a way to adapt and meld with new technology, they will be rendered useless.

Speaking of new technology, check out this quote directly from the report, which can be read online here.

“Device interfaces will merge with the human body: eventually, we will receive information on contact lenses, our skin will become a touch pad and we will use brain interface machines to command devices via neural control. While in their infancy, breakthroughs in these areas have already been achieved, laying the beginning for human beings to merge with the digitalised world.”

And also…

“…smarter homes become primary care-centres and enable early detection and prevention of illness.”

Terrifying. I know. But it seems that’s just proof that I’m getting old.

One theory presented by Ruthven, though softly disagreed with by IBM Australia’s managing director Andrew Stevens, was the idea that the traditional division amongst the Australian people– the country-versus-city tribes– will dissolve as people become more flexible with the introduction of advanced technology and giga-speed broadband (which Singapore and Finland already have, by the way… hello, National Broadband Network. I need you in TinyTrainTown). He told the assembled media that the term ‘employee’, and the virtual physical imprisonment that comes with it, would be not only obsolete but a foreign concept by 2050.

A discussion panel including both Ruthven and Stevens, as well as Mark Pesce and social analyst Neer Korn, wove a picture of the future that sees us as our own virtual work stations– most people working from home, easing the divide between rural and city areas in Australia. Pesce detailed the trend of smartphones bringing services to people and creating a virtual marketplace where none existed before, flipping the economy and our traditional ideas of service; Korn spoke of a basic drive that will become more evident as our technology grows– the need to switch off from constant cyber communication and enjoy ‘real’ experiences and face-to-face human contact. Social meeting places were mentioned– tech-free zones where people, isolated from working at home all day, could gather in the afternoons and chat, face to face.

The utopian hippy within me sings with that one– a way to enhance tribal communities, once we figure out how to make all this new stuff seem a bit less obtrusive.

The concept of community and the way it will adapt was mentioned a few times, especially in reference to the difference between ‘older’ and ‘younger’ generations. It was pointed out that young people today are embracing technology, whereas the older we grow, the more afraid and tech-illiterate we seem to become. While the younger generation are happy to embrace online socializing, those in or above the Gen X sub-class tend to view increasing screen interaction as the very nemesis of social connectivity. It was predicted that this is where the biggest divide will occur– younger generations will embrace new technology, whereas we will hang back and be afraid. The awesome concept of a digital Zimmer frame, complete with touch pad and wifi, was mentioned– a metaphor for the ‘help’ stations that will be needed at various points in order for the old geeks (that’d be us, I think) to easily access all this new, mind-blowing, uber-cool Back to the Future-type gear.

The single most disturbing concept to come out of this forum was, as I mentioned, the idea of your skin becoming a touch pad and your information being stored in a contact lens. The general reaction when I posted this on Twitter and Instagram was intense distrust– which kind of confirms the ‘ancient and terrified’ theory. I get it– I’m terrified, too. But as a whole, the picture presented at the Digital Snapshot was an entirely positive one. It will take work, we were assured, but I don’t think that will be a problem– if nothing else, Australians know how to work our guts out. And with that hard work will come a shift in the way we live, a swing backwards, making this life that we’ve made so difficult for ourselves a bit easier.

The report classified Australia’s industry groups into those that won’t survive this new digital age, as mentioned above; those that will be only slightly affected; and those that will undergo a complete overhaul, a total shift in the way their business is done– healthcare, social assistance, retail, education and mining, to name a few. (There was a lot of chatter about mining at this presentation. For those non-Aussie readers, there is always a lot of chatter about mining when talking business in Australia– though, to be honest, it’s virtually invisible to most of us, except that we all know someone, generally a twenty-something male, who has gone to work for months at a time in the open cuts in WA, soaking up huge wages because there is simply nowhere to spend them, and being flown home for one weekend a month.)

A dividing point amongst mummy bloggers has always been along the lines of “Won’t somebody think of the children?!”– wondering how much of what we write will be cached, accessible forever on the interwebs. I’ve always been more worried about the opposite– the Internet disappearing overnight and eating everything I have here. It’s online, not real life… but this web page is a part of me.

With split servers and clouds and the Google God knows what else multiplying like Gremlins, that worry is beginning to fade; and I guess I get what everyone stresses about– the digital imprint I leave behind, akin to a house packed with stuff: little bits of information that piece together my life.


My mum is a science fiction fan and I remember reading a short story by one of her favorite authors, Anne McCaffrey, when I was a teenager. It involved a class at a university, given the task of checking microchips for important information before deleting their contents.

The microchips belonged to deceased people– the story was set in an imaginary future where every house, each individual, had their own microchip embedded within the walls on which to store whatever data they liked. (Even the imagination of a science fiction author can’t quite grasp it, I’ve noticed, even though they get so close– microchips, because retina scans and contact lenses were just too far-fetched. Stephen King’s futuristic The Running Man, one of the Bachman books, sees the main character years into the future… carrying a hefty, heavy video camera and the large tapes to go with it, then sending them via snail mail to his oppressors.)

The protagonist of the story– female, as most of McCaffrey’s heroes were– walks in on her classmates sharing immature amusement at the very mundane, seemingly ridiculous content on the microchips of ordinary people.

I don’t remember many of the details from there on. What I do remember is our heroine making her speech– her point that the very basis of this technology was the intrinsic human element it carried– and proving it by recounting the anecdote of a woman who recorded her husband snoring just to prove to him that he did… then played the recording back to herself on a loop every night for years after he died, because she couldn’t sleep without it.

Digital proof of life.

Reading that story in the mid-nineties, it seemed exactly what it was– fantasy, a story, fiction, a flight of fancy.

Now, in the year 2012, it might as well be reality– it’s not only the concept of a digital society we’re looking into, but the actual human society it encompasses.

Look at the synopsis of that short story again. And then tell me– what have you got stored on your iCloud?


I’ve made the executive decision to try this whole transferring-CDs-to-usable-music thing again, backing them all up on my external hard drive this time. At least it has space for a brain, should it want one.

And I’m undecided as to whether to keep the CDs themselves. I think I will, just for the retro value… to see what funky things my grandkids can upcycle them into.

Or maybe just for that letter opener effect– to leave them wondering what the f*ck these shiny, circular bits of plastic were used for, anyway.



Jerilyn February 7, 2013 at 12:57 pm

I personally wish to save this specific article, “Section Eight: Science, Non-Fiction- #ausdigitalfuture” on my web site. Do you mind in the event I really do? Thanks a lot -Jerilyn


Melissa June 20, 2012 at 8:34 pm

When I think about the drastic changes that have occurred in my lifetime – I'm a little nervous as to what changes my children will see.


Lynda Halliger-Otvos June 20, 2012 at 3:53 pm

F%^#k. You are a good writer, Lori, you make this technical stuff so damn understandable…. thank you.


em June 20, 2012 at 2:21 pm

I am way too blind to read all that tiny font.


om June 20, 2012 at 12:03 pm

ah, so no more hard software ;) I do wonder where all the power to fund this worldwide broadband and computing will come from; perhaps we need to magically harvest all the heat absorbed by the miles and miles of empty asphalt when no-one commutes any more…


Lori @ RRSAHM June 20, 2012 at 11:26 am

Hey E :) I think by ‘computer software production’ they mean the production of the actual disks and USBs that contain software– everything will be downloadable online and a physical copy won’t be needed…


E June 20, 2012 at 10:23 am

As an IT professional (of the computer software writing, and researcher variety) one comment in there made me really go .. "HUH?"

'the industries that will die off in the next 50 years, aptly christened as ’dead men walking’ on Twitter, include … computer software production'

So who's going to be enabling this mighty future they're speaking of.. that all runs on computers? None of that will happen without the production of software and human-computer interfaces to enable it. How did they justify that statement?

