Much has been written (which I'm not adding to here), as well as said, filmed (in science fiction and more mainstream media) and even sung, about 'the rise of the robot', often posing questions about when 'artificial intelligence' might become 'actual' intelligence; in other words, when does a machine, created with the ability to think, become sentient? When does the line between 'artificial' and 'natural' intelligence become an irrelevance? Will the robots we build in our own image rise up and destroy us, just as we, perhaps, did with our own creators?
Well, I'm not writing about any of that, not at all.
I’m writing about the polar opposite.
When did we, mankind, start to become cyborgs? Where does the path we've irrevocably begun to journey along eventually lead?
On December 23rd 1997, I was at a friend's house, waiting an hour or so between journeys to and from South London – we'd gone in the morning to get tickets for a football match that evening, then gone back to his parents' place, and didn't have long before leaving again for the game. This was a world before you could easily check a journey's ETA on your smartphone, and we had under-estimated how long it would take to get to and from Selhurst Park from Edgware. If a train was delayed 20 minutes between stations on your way home, to boot, it wasn't easy to make an alternative plan – you stuck to the one you'd made. Incidentally, but not entirely irrelevantly, the entire operation was doomed anyway, as the game was postponed at half-time due to floodlight failure – the only reason I remember the exact date this happened.
My friend's father was a part-time journalist for MacUser, and he was watching a TV show that caught my attention – it included an interview with (or speech by) the head of R&D for IBM at that time, who was talking about how humans would be silicon augmented by the year 2020.
I found this idea fascinating.
We would be silicon augmented by 2020.
In my lifetime.
At the time, I had recently moved on from studying Information Systems at
University, a topic I was drawn to in large part because of the potential
opportunity to study Artificial Intelligence in the third year of the course. I lasted
one term before switching to a degree studying Religion and Theology because,
it turns out, late-night debates about the nature of the mind, philosophy, human
nature and the primary reasons behind most human conflict are actually more
likely when studying Religion as opposed to Computing. At that time, a degree
in Information Systems was really a degree in Computing lite, with a bit of
Management thrown in. Of course, looking back at those early moves from the world of study into the world of work, I note with no sense of irony at all that I now work as part of a team managing a computing business, having decided to stop studying computing because I was more fascinated by the cultural reasons why we, as humans, do the very things we do. Then again... maybe not.
So – let's think about this – by the year 2020, humans would be silicon augmented, according to one of the people best qualified to make such a prediction in 1997. To qualify that assertion: in 1997, IBM were probably the world's leading tech company. Google didn't exist yet. Apple were just passing their nadir (they brought Steve Jobs back in mid-1997, which led to their recovery). IBM were grabbing the headlines; just months before Jobs returned to Apple, their Deep Blue computer had become the first machine to beat a reigning world chess champion in a match, defeating Garry Kasparov. Is it any coincidence that the AI business Google acquired back in 2014 for around $500m is called DeepMind, and that it literally made a movie (AlphaGo) of its biggest PR stunt – an attempt to win a game of Go, arguably the world's most complex board game, against one of the world's best players?
So if IBM's Head of Research and Development said that we were to be augmented with silicon by 2020, it was a claim worth taking seriously. It rattled around in my head for years, on occasion. I began working in Digital Marketing in 1999, having segued from studying Religion and Theology to a post-grad in Advertising and Marketing (well, I didn't think many of the big religion companies were looking to hire at the time, so I had to study something a little more vocational – also, as I recall, there was a certain 'lady friend' I was trying to impress, in the parlance of the time... but that's another story).
By the mid-2010s, I kinda thought the prospects for the prediction coming to pass were evaporating like so much spilled Tab Light on a brand new Tamagotchi. But then, I had an epiphany.
What did being silicon augmented actually mean?
I had it all wrong for years!
My literal interpretation was that we'd have silicon chips in our arms, or our brains. We'd become 'The Man-Machine'; we'd be the Robots. But perhaps I'd got the model wrong. Did the silicon need to be physically within the corporeal flesh for us to attain cyborg status?
Let us consider the questions: what is 'the mind'? What is 'memory'? Biologically speaking, science cannot really pinpoint a part of the brain within which 'memories' exist. Philosophically speaking, can we truly define 'the mind', or even, really, the nature of memory? What are your earliest memories? How many of your earliest memories are truly memories of the things themselves, recalled as if they happened yesterday, as opposed to memories of photos of those things – where your memory is not really of the thing but of a photo you saw frequently in your youth?
In 2007, Apple launched the iPhone. The following year, Google launched the Android operating system. Today, between the two, they have around 3.5bn users in a world of around 8bn people, 2bn of whom are children. So let's say around half the world's population, excluding children, own a smartphone running either Apple's or Google's operating system.
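That 'around half' is only back-of-envelope arithmetic; a quick sketch (using the rough round-number estimates above, not precise statistics) makes the figure explicit:

```python
# Back-of-envelope check of the smartphone-ownership claim.
# All figures are the rough estimates from the text, not precise statistics.
world_population = 8.0e9   # ~8bn people
children = 2.0e9           # ~2bn of them children
smartphone_users = 3.5e9   # combined Apple + Google platform users (approx.)

adults = world_population - children            # ~6bn adults
share_of_adults = smartphone_users / adults     # 3.5bn / 6bn

print(f"~{share_of_adults:.0%} of adults own a smartphone")  # ~58%, i.e. 'around half'
```

On these round numbers the share comes out nearer 58% than 50%, so 'around half' is, if anything, conservative.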
Do you own a smartphone? If you do – you, my friend, are a cyborg. Do you have part of your memory stored on your phone (or 'in the cloud')? The photos and videos of the places you've been? The contact details of your friends and family? Your mode of communication with those friends and family (WhatsApp, iMessage, FaceTime, email, voice calls)? Your plans for the future, in your calendar, and your record of the past, stored as 1s and 0s that you'd never be able to retrieve if Google or Apple suddenly didn't exist? Your decision to watch a certain specific piece of content at a particular time (ever been at someone else's house and seen their Netflix recommendations – always looks weird – same for content on iPlayer, books or general products on Amazon, and of course the ads you see online, where the advertiser, the content of the ad and the ad placement itself are all designed specifically around what you are genuinely most likely to want or need, based on your previous behaviour and the behaviours of those like you)? Your decisions on how to navigate from point A to point B – whether on the roads (thank you, Waze) or through the streets (no more getting stuck on the Tube, thanks to Wi-Fi on the Underground)?
So – where does this lead? Why does it matter?
Well – perhaps – the more we do, the more we do. What begins as a crazy notion that, perhaps by stealth, becomes reality may enable more far-fetched notions to become less far-off, as they no longer feel so outlandish. Life
imitates art, imitates life, imitates art. The Warp Drive, Replicators, Holodecks (and flip phones) and Transporters from Star Trek; the Metaverse from Snow Crash; The Man-Machine from Kraftwerk – such ideas were dreamt up by creators, sparked by what they saw around them, implanted into the collective (Borg, ahem) consciousness of our hive-mind, taken up by curious scientists, and invented – making real the very crazy notions that were previously just ideas for fiction, which made great reading, writing and music precisely because, Black Mirror-like, they seemed crazy enough to be extraordinary, but potentially real enough to become, one day, reality.
Why does it matter? Human evolution? The hybridisation of our species? If the mind is the start of our transformation, where might it end? Is it too far-fetched, too apocalyptic, to suggest that over the next hundred years of climate crisis, if we could replace our physical dependence on oxygen and organic food for survival, and if we could increase the capabilities of these physical vessels in which 'the I' exists, perhaps we remove the imminent threat? (No, I'm not advocating that as the answer – it makes far more sense to try to fix the mess we've created than to ignore the mess and adapt ourselves – but maybe avoiding the issue, versus trying to stop it before it destroys us, might be more pragmatic from where we sit right now.)
Well – I don’t know.
One thing I do know.
We are silicon augmented.
We are Cyborg.
It just happened without most of us realising it had done so.