Welcome to the era of Human Intelligence

Late last month I travelled to Queenstown, New Zealand, to deliver a speech to a group of property managers and investors on the topic of technology-driven change and the choices we face (thanks to Dinesh Pillutla and the team at Core Property Research for inviting me to speak).

It was an interesting experience, not only for the chance to delve into an industry that is itself undergoing significant change, but also because the speech that I gave wasn’t the one that I had first set out to deliver.

Having spent more than 20 years swept up in the rapid changes of the technology sector, it is easy to forget that people outside the sector have a very different perspective on the role of technology, and a different appreciation for what it can and can’t do. As a speaker, this makes it all too easy to bamboozle an audience with demonstrations and prognostications of the technological utopia/apocalypse ahead – which might be entertaining (or unsettling) in the moment, but which holds little value over the long term.

This time I set out to take a different approach. My main thesis was that while technology is evolving quickly, we are putting our focus on the wrong things: we are focusing too much on technology and what it can do, and not enough on what we want it to do.

In short, we need to stop thinking so much about technology, and start thinking a lot more about ourselves.

So when it came time to talk about AI, I chose to talk about something that technologists rarely talk about – human intelligence – and the skills and abilities that we already possess (and should be thinking about more) when it comes to understanding our role in a future world where AI is a major factor.

Why? Because getting from the first Australian computer (CSIRAC, built in 1949) to today’s AI took less than 75 years. We have gone from basic machines to a simulacrum of intelligence in the blink of an eye. Evolutionary biology took approximately 750 million years to complete the same task.

It’s an impressive achievement, and not something we’ve needed to be overly concerned about – until now. Throughout history the development of technology has mostly been in support of human endeavour, and has tended to create more opportunities than it has erased. Now we may have reached a point where instead of supporting us, technology is competing with us (or more precisely, we are competing with it), and given its rapid evolution, we will fall behind quickly.

This is something we have seen time and time again throughout history – especially in sectors such as manufacturing – but now the emergence of more capable AI systems means the field of competition has broadened considerably. The most common question I get asked in any conversation about AI is ‘will AI take my job’. And the answer I give is most often ‘yes’ – it’s just a question of when.

At some point many of the jobs we do today won’t exist, but the expectation (still – and far from proven) is that more new ones will be created. The key for us as individuals is to anticipate which roles AI will perform better than us – and by when – and then work out what we need to do to ensure we stay relevant in that AI-focused future.

Hence the need to think more about human intelligence.

So in my presentation in Queenstown I posed the question of whether my audience would find their jobs replaced by AI, and then answered with a provisional ‘yes’ – that being if:

  • You have lost your sense of curiosity.
  • You are unable to listen and learn from diverse perspectives.
  • You cannot elevate yourself out of your immediate environs to see the bigger picture.
  • You lack empathy.
  • You are unable to align with others.
  • You cannot communicate.
  • You have stopped learning.
  • You are not adaptable.

If those traits describe you, then there is a very good chance that you will find your job replaced by AI. But exercising even a few of those skills provides a foundational capability that will help you maintain or grow your value in the turbulent years that lie ahead.

In summary – we need to worry a lot more about exercising our own human intelligence than about the artificial kind.

No one can predict the future. We can make inferences and predictions, but we run the risk of being very, very wrong.

But even though we can’t predict the future, we can consciously change the future through the actions that we take today.

And that is a capability that no machine can match (at least not yet).


Why you really need an AI strategy

Last month I had the pleasure of joining a panel session at the Municipal Association of Victoria’s MAV Technology conference, to discuss the challenges and impact of AI.

Not only was it a chance to sit alongside luminaries such as Adam Spencer, Lisa Andrews, Morris Miselowski, and our moderator Holly Ransom, but it was an opportunity to explore exactly what AI means for local government – which it turns out, is not dissimilar to what it means for many other mid-sized organisations.

The key question I considered when going into the session was whether an organisation such as a local council actually needs an AI strategy.

My conclusion was a resounding yes.

Despite its label, AI is a very human challenge – one that can create fear and uncertainty among workers, customers, and communities.

Having an AI strategy doesn’t mean developing a complex technological roadmap for the creation of AI systems. What it does mean is being able to articulate how an organisation is using AI today and its guidelines for how it will use AI in the near future.

Many of the applications for AI have come into common use almost by stealth, such as unlocking a smartphone using facial recognition, using predictive text in a word processor, or receiving shopping recommendations on a website. AI has been a part of everyday life for many years – it is only the accessibility of ChatGPT and similar generative AI tools and their ‘wow factor’ that has thrust AI into the spotlight.

This sudden rise to prominence has created a lot of questions – principal among them being “will AI take my job”. This is quickly followed by “should I use AI to help me with my job”, “should I be feeding data into an AI to improve its usefulness”, and “what are the privacy and copyright implications when I do?”.

These questions are only the tip of the iceberg, and they are being asked by executives, managers, and workers all around Australia. Without an AI strategy, where can they turn for answers?

For local government, there is also the need to answer the questions of ratepayers, many of whom may be concerned by the use of AI and how it might impact their privacy and other rights.

The use of facial recognition without consent is already a contentious topic, and the Robodebt scandal has further eroded people’s trust in government and its use of technology. Recent months have also seen many council meetings interrupted by people who are concerned about how technology is being used today to manage communities – and how it might be abused in the future.

At the very least, an AI strategy needs to consider:
– Guidelines and commitments regarding where AI will or will not be used, in alignment with expectations of privacy and human rights. This needs to be specific in relation to the use of AI in activities that involve the general public (chatbots for customer service, automation of penalty notices, use in video surveillance, etc.) and should provide clarity for staff whose working lives may be impacted by these technologies (such as customer service agents).
– An inventory of where AI is being used today, and why. This may require an investigation of existing software applications to determine their own use of AI.
– Clarification of decision-making processes and guidelines to be used when making future investments in AI-based technologies.
– Guidelines for managers and staff as to which AI services are cleared for use, and for what purposes.
– Further guidelines regarding how different data types can be used in relation to AI systems.

This is not an exhaustive list, and the creation of a strategy should start with establishing a stakeholder group that can work through a more comprehensive set of considerations.

AI has massive potential to do good things for local government, such as improving services and reducing their cost of delivery. But as with many fast-developing technologies, the potential for backlash – and very real damage – is equally strong.

Why talking quickly takes you nowhere fast (and what to do about it)


As a communications trainer, there is one piece of advice that I find myself offering up more than any other. It’s also a piece of advice that I find the hardest to implement.

Slow down.

Rapid-fire delivery is one of the most common crimes committed by speakers, be it on stage, in interviews, or in general conversations. It is also one of the most likely reasons why your communication efforts may not be having the desired impact.

Just because you can speak quickly, that doesn’t mean your audience can listen quickly. And there is no chance they will retain or be influenced by what you’re saying if they can’t keep up with you.

I know this from the repeated experience of being on the receiving end of fast talkers. As a journalist who sometimes records interviews, I’ve heard things in the recordings that I never heard during the interview. I’ve even asked questions that had already been answered earlier.

When you speak quickly, your listener will hear a few components of what you have to say, but they are unlikely to retain much of value. Like a stone skipping across a pond, you offer no chance for your words to sink in.

Listening is not a passive process (Oscar Trimboli can tell you a lot more about that).

When a person is listening, they are also often learning something new, and trying to assimilate that knowledge with what they already know. By speaking too quickly, you fail to give your audience time to absorb what you are saying, which can lead to them falling behind and quickly losing interest.

Fast speakers have offered me plenty of excuses for their rapid-fire delivery. For some, it is a habit they developed early in life that they have found hard to break. For others, fast speaking arises from nervousness, and the feeling they need to say everything they need to say before they forget it. And for some, it comes about simply because they are excited and have a lot to say and are trying to cram in as much information as possible (which is the excuse I most often give myself).

None of these excuses alleviate the suffering of the audience, and no matter what the reason, fast speaking will always mean you are less likely to influence your audience in the way you wanted to.

Unfortunately, there is no fast remedy for fast speaking, other than enforcing a stricter discipline over your delivery speed. Trust me, I’ve looked.

There are however some techniques you can use to make things easier for yourself and your audience.

The first is to pause every now and again. This works especially well on stage by providing a moment for your audience to catch up.

The second is to repeat the things you most want your audience to hear. This tends to also work best in presentations, but can also be used to great effect in interviews or even conversations – just don’t overdo it. You might also want to come back to key ideas several times in different ways (for instance, placing them in context using examples) to ensure they sink in.

Neither strategy is as good as simply slowing down though, and that means being conscious of your speed of delivery and taking the steps needed to moderate your flow.

The great thing is, slowing down not only helps your audience, but it also gives you greater control over what you are saying. Slowing down enables you to think further ahead, as your brain is able to catch up to (or get ahead of) the words coming out of your mouth. This means you can start to guide the discussion, and you can also buy yourself the cognitive capacity to pay more attention to your audience and their non-verbal responses.

And you are also likely to become more economical with your use of words. Faster speakers tend to use more words than they need, as they are using ‘verbal polystyrene’ to pad out what they are saying and buy time to think about what they really want to say.

Slowing down gives you more time to think about what you are saying, which can see you using fewer words to say exactly what you need to say in the same time it would take if you were talking quickly.

Most importantly, slowing down creates a better experience for the audience – and that is crucial if you truly want your words to have any chance of changing the way they think, feel, or behave.

Why good communication starts with a simple equation


At the heart of all professional communication lies a very basic equation, whose solution goes a long way towards determining whether your efforts succeed or fail.

When you are telling a story for professional purposes, you are asking for something from your audience. Initially you want their attention, but what you often really want is the chance to change the way they think, feel, or behave.

However, if you want something from your audience, then you had better be giving back something in return. And if you really want to be successful, then you need to make sure they believe that what you are giving them is worth more than what they are giving back.

Whether you are delivering a presentation, being interviewed, sitting on a panel discussion, or even writing a blog post, the equation is still the same. If what you are giving is worth less than what you are getting back, then you soon won’t be getting back anything.

This equation is obvious in media interviews, where the person being interviewed generally wants the amplification and authority that comes with speaking to reputable media outlets. Journalists just want good stories (I always did), but we are not the group you are really trying to influence – which is unfortunate, as without an audience, stories can have no impact. Fail to give the audience something of value, and no one gets what they want.

It is also true in sales engagements, where it is obvious that you want something from your audience – i.e., their money. If someone can’t see the value in listening to you from the outset, then you’re going to find reaching your goal becomes harder and harder.

Exactly what an audience might want varies from situation to situation. In some instances it is knowledge of current events. In others it is insight and education. At other times they might simply seek entertainment. But in all instances, if the audience is not getting something of value, then they won’t be an audience for long.

This value equation applies in all forms of communication, such as on-stage presentations, blog posts, or even sales and marketing material. No one is going to pay it any attention unless the value of doing so is established very, very early.

Journalists act as proxies for our audiences, putting ourselves into their shoes and considering the questions they might like to see asked and the things they might want to learn. The better we are at this, the better we become at connecting with and building that audience. One of the primary reasons I’ve declined interviews throughout my career is that I saw no value in them for my audience.

But for any professional communicator, understanding the needs of the audience is critical. If you don’t know who your audience is, then how can you know what they want or what they need – and therefore how can you be sure you are delivering anything of value?

Understanding your audience is one of the fundamental elements of good communication. There is never any excuse for not putting in the time to research your audience appropriately, and failure to do that research is one of the primary reasons why communication efforts fail to achieve their goals.

Most people are polite, but if they are only listening to you out of politeness, that can hardly be classed as a successful outcome.

Only by understanding your audience can you define what value they might be seeking, and align what you can offer to them. It’s surprising how often delivering greater value to your audience upfront will be rewarded in the long term.

The metaverse, marketing, and neurotech – a match made in a dystopian nightmare

In an era where privacy has been steadily eroded, the one sanctuary that most of us have held on to is the privacy of the thoughts within our heads.

But it seems even this last redoubt might soon come under siege. For centuries psychics have claimed the ability to read minds; now we are making this capability real, thanks to rapid development in the field of neurotechnology and, specifically, the creation of brain computer interface (BCI) devices.

But once more it seems the pace at which we can develop new capabilities is going to outstrip our ability to consider and manage the consequences.

So what is a BCI? Put simply, it is a device for sensing and interpreting the signals of the brain. Where common neurotechnology devices such as MRI scanners can determine what parts of the brain are active at any given time, a BCI device can determine what the brain is actually doing – or more specifically – what it is thinking.

The detail and accuracy of BCI devices is astounding – down to the level of individual words. A 2021 trial of a BCI device on a person who was paralysed and non-verbal saw them use an implanted device to communicate at a rate of 18 words per minute with 94 per cent accuracy. While the techniques used suggest there is still some way to go to true mind-reading (this example focused on imagined muscle control), this is another step along a seemingly inevitable pathway.

Today the capabilities of BCI devices depend greatly on the proximity they can achieve to the neurons they are trying to sense, with the best results achieved using implanted devices where electrodes are inserted under the skull, as in the example described above. Good results have also been achieved from devices implanted under the scalp, and even wearable (non-invasive) devices are showing promise.

Exactly how accurate these wearable devices will prove remains to be seen, but given their use is mostly unregulated (especially as they are not ‘medical’ devices), there is a good chance that a lot of investment dollars will flow towards seeing how finely their resolution can be tuned (Elon Musk certainly seems keen).

But the implications of BCI technology go far beyond giving speech to the speechless. Creating a device that gives one party access to the thoughts of another has massive implications across many aspects of life.

Take marketing, for example. Not only might a marketer be able to see the difference between what a person thinks and what they say, but they could also pick up on signals and make suggestions based on thoughts that a person might not even be aware of. This would be a much more accurate form of contextual advertising, based on evidence rather than inference.

It is unlikely that any individual would be willing to submit to constant mind surveillance by their favourite brand – although with the right incentive, it is not impossible. However, there is one scenario where BCIs are likely to play a major role – the metaverse.

One of the key barriers to truly immersive virtual reality experiences is the control interface, which must use hand and body gestures as proxies to control actions within the virtual world. Using a BCI however means a person might only have to think about ‘running’ in a specific direction, or about picking up an object, or any manner of other interactions, for that thought to be translated into an action in the virtual world.

How much of a stretch is it to go from monitoring a participant’s commands to interpreting all of the other data that the BCI is extracting?

One immediate application is contextual advertising, and the ability to present a brand at the precise moment when a person is thinking about that product category.

For content platforms, whose job is to keep people engaged, the BCI can be used to present content which has been determined as being most likely to garner a response at that moment in time. Given the furore that erupted when Facebook was shown to be manipulating people’s moods through the content it showed them, it is not hard to see the possible harm that might result.

Or what about for an online casino, which now knows exactly what it needs to offer to keep a player engaged and spending?

While none of these possibilities are viable with the BCI technology available today, at the current rate of progress, this decade is the one where the boundaries will be tested – not some distant and unforeseeable future.

So what will the world look like when not even the thoughts in our head are ours alone?

If you’re interested in learning more about the technical, legal, and ethical challenges of neurotechnology, then please come along to the second Neurotechnology Forum, taking place in Sydney on May 17.

CMO – How to include disabled communities in marketing

Disabled Australians eat fast food, wash clothes using laundry powder, and even drive cars. Yet look at the people used to promote these products in advertising and there is not a single disabled person in sight.

The tendency towards only featuring able-bodied people in advertising might be defended based on the law of averages (as the average Australian person is not likely to be visibly disabled), but this runs against the spirit of inclusivity that many brands preach. It also ignores the reality that one in six Australians have a disability.

In this article for CMO Australia I had the chance to explore the topic of representation for disabled Australians in mainstream advertising, and speak to some of the marketers that are working to bring greater representation to a diverse group of Australians.

You can read more by clicking here.

Neurotechnology – testing the limits of intelligence, ethics and the law

Amidst all the discussion regarding generative AI brought on by the release of ChatGPT and its cousins, another intelligence-based technology may well have an even more profound impact on human life.

Like AI, neurotechnology is a field of research which has existed for many decades, and which has also made some fundamental leaps forward in this decade. Chief among these have been breakthroughs in the use of electronic sensors to detect and decode human thoughts, opening new possibilities for direct control and communication with digital systems, and unleashing a plethora of moral and ethical considerations.

For example, in 2021 researchers demonstrated a system which implanted neurotechnology into a person who was paralysed and non-verbal, and which enabled them to use this brain computer interface (BCI) technology to communicate at a rate of 18 words per minute with up to 94 per cent accuracy.

Such technology holds enormous promise through providing new communication options to those who can’t speak, or command and control capabilities to those with limited physical ability, or perhaps one day to give sight to the sightless. This technology can also enable a range of sensations in remote communications, such as bringing a sense of touch or smell to interactions within virtual worlds.

Importantly, these are not possibilities of a distant future – some are being tested today – and as interest and funding grow, so too will the speed of the breakthroughs.

But while the underlying technology promises incredible benefits, it is not hard to imagine less savoury use cases, such as the delivery of novel forms of coercion, or the extraction of information and confessions from alleged criminals or political dissidents.

The rapid emergence of neurotechnology and its implications for ethics and the law was the key focus of a world-first seminar hosted by Jewelrock in Sydney in December 2022 and supported by the law firm Baker McKenzie. During the session numerous global experts on neurotechnology discussed recent breakthroughs and their implications, highlighting the enormous work required to understand and prepare for the future that neurotechnology will deliver.

A second seminar is now planned, as part of a series that will continue through the year and into 2024.

I was fortunate to be invited to write the official report for the event, which you can find by clicking here.

While the true impact of neurotechnology is hard to fathom, one thing is clear – while artificial intelligence may have the spotlight today, we have barely scratched the surface when it comes to technology’s impact on human intelligence.

Bridging the digital divide in 2021

If you are reading this blog, then there is a good chance that the Internet was one of the key tools that helped you cope with the events of 2020. While Australians sheltered inside through the lockdowns (and in some parts of the country, continue to) we turned to the Internet in record numbers to buy groceries, order food, stay connected with friends and loved ones, and entertain ourselves. The Internet also provided the lifeline that enabled many people to keep working even when they couldn’t get to their offices, and was vital for ensuring that kids could stay connected to their schools.

As frustrating as the COVID crisis has been, imagine what life would have been like had it happened in 2000, when much of Australia was still offline, and those who were online were struggling with dial-up Internet speeds?

Now consider that here in 2021, approximately 2.5 million Australians are still not online. That’s 2.5 million people who are not buying groceries online, or watching Netflix, or Zooming their friends and family. That number also includes an unknown number of people who were not working from home last year, and most concerningly, a lot of children who were not participating in online education.

As much as we might take the Internet for granted these days, for some Australians it is still expensive (both in terms of access costs, and the price of devices needed to connect to it), and for many, hard to use. There are many reasons why a person might be one of the 2.5 million, but principal amongst them are their economic situation, age, educational background, physical ability, and digital skill level.

And as more and more of Australian society moves online – taking our social discourse onto online platforms – the less of a voice this group has in social debate.

We call this the Digital Divide, and while it is a topic I remember writing about in the mid-1990s, it is well and truly present in Australia today.

Thankfully, there are organisations out there who are committed to closing this Divide, many of whom are represented in the membership of the Australian Digital Inclusion Alliance. It’s also been heartening to see that some commercial organisations are starting to realise the extent of the Digital Divide, and are beginning to look at it as both a social obligation and a commercial opportunity.

Recently I had the chance to check back in on the state of Australia’s Digital Divide, and the work that is being undertaken to bridge it, through my writing for CMO. Click here to read the complete article.

Emerging jobs in data science and AI – a webinar for Monash Tech Talks

There is a common fear accompanying the emergence of almost every new technology: that its introduction will put someone out of a job. While this is undoubtedly true in some instances (modern sewers put the nightsoil collectors out of work, for example), what is also true is that new technologies create new jobs.

We can see this in Australia today, where the digitalisation of the economy is creating shortages of related skills, especially in fields such as analytics and cyber security. And while even some of these jobs will eventually be automated, it is highly likely that new skills will be required along the way.

The key question for any individual who wants a long and interesting career is what these new jobs will look like, and what skills will be needed to perform them.

I recently had the pleasure of putting this question (and many others) to a panel of experts for the Monash University Tech Talks series. I was joined by the Director of the Centre for Learning Analytics at Monash (COLAM), Professor Dragan Gasevic, along with Digital Architect and Advisor Rita Arrigo, and the Head of Service Delivery at the Silicon Valley-based AI company SKAEL, Ragu Mantatikar.

Together we discussed how AI and data are being used across different industries, whether AI will really replace jobs, what new jobs are emerging from our world’s obsession with AI and data science, and what skills professionals need to gain to future-proof their careers.

If you want to watch the replay, you can find it by clicking here.



New presentation stream – Beyond COVID-19, and why we are a long way from reaching the ‘new normal’

We can’t yet begin to accurately estimate the impact that COVID-19 will have on the Australian economy or our society, nor can we realistically guess at what 2021 will hold. With much of the economy being propped up by stimulus payments, and the threat of the pandemic likely to linger in our consciousness for many months yet, there are simply too many variables in play to allow anyone to accurately model the behaviours and outcomes we will see in 2021.

But there is nonetheless plenty we can do to prepare for what lies ahead. The strategies for managing crises are well known, and there are plenty of signals we can examine that will tell us at least what lies on the near-term horizon.

This week I had the opportunity to explore some of these themes in a presentation for DXC, as part of its webinar Adapting to customer channel disruption during a crisis. Hosted by the Director of Customer Experience at DXC Oxygen, Marco Formaggio, and with fellow guests including BGW Group Services’ General Manager David Dean and Suntory Frucor’s Fernando Battaglia, we discussed the importance of investing in both people and technology as a means of delivering great experience.

The need to deliver great experiences for customers is unlikely to diminish any time soon, even if our ability to invest comes under pressure through having to tighten spending during the looming post-COVID recession. One possible outcome is that the next six months will provide the impetus many consumers need to switch brand loyalty to better-value options. We are already seeing consumers holding back spending, and more of their purchasing journeys will now start and end through digital channels. All of this tells us we need to think differently about customer experience, brand value, and customer journeys.

It is also possible that the post-COVID world will lead to the emergence of new customer segments and new behaviours. Managing through 2021 will require a fine balance and some careful planning as to where investment dollars can best be deployed – and that means having a strong capability to sense the state of the market and the intentions and moods of customers.

The rising preference for great experiences has been a common theme for B2B and B2C organisations alike over the past decade. But providing the best possible experience under normal circumstances just won’t cut it any more. Great CX is not about what an organisation delivers in the main – it is about what that organisation does to step up when things go wrong. That is where the strongest relationships are formed, and right now, there is a lot that can go wrong.

I was the third presenter in this webinar, and my contribution starts around the 30 minute mark. It contains a snapshot of a much larger body of work I’ve been compiling as I’ve been discussing strategies for 2021 with numerous organisations – something that I’ll be building out over the next few months.

One thing is certain – for all the talk of the ‘new normal’, we are still many months away from reaching what might be classed as a steady state for the Australian economy. The current post-pandemic period is the artificial creation of government stimulus spending, and we have no idea what life will be like once that is wound back totally in 2021.

What we do know is that for most organisations there is plenty of hard work ahead, despite their workforces already being fatigued by the events of the past six months. But by investing in people, building a sensing capability, and giving themselves the best possible opportunity to make the right investment decisions, they will be able to find ways out of this.

Unfortunately from an economic and societal perspective, things will get worse before they get better, but they will get better. Just how well positioned an organisation is for those better times depends greatly on the work it does today.