
Cognitive Offloading: Is this time suck burning you out?

Updated: Nov 6

By Dr E K Wills

 

Time to sort it out!

Ever had a great idea and needed to flesh it out?

Or been given a 10-page report and needed to distill it down to succinct points?

 

Well, AI now makes this possible, and in fact it is so pervasive that 40% of people report receiving AI-generated, convoluted or overly wordy information that lacks clarity. The problem is now common enough that it has been given a name: workslop.

 

I run a private practice and an online startup and have a family of five, so I need my day to be effective and efficient. When I see reports or posts with that distinct AI-generated touch, such as lots of asterisks, emoticons or emojis, I find myself almost rolling my eyes and noticeably switching off. If I do have to read it, I trust what the author is telling me less; otherwise, I simply move on.

 

This kind of low-effort, unhelpful work reflects 'cognitive offloading' and is happening across industries, most prominently in professional services and technology. It also adds to burnout: the receiver has to figure out what to do with it, which wastes valuable time and adds to frustration.

 

Workslop is often full of what is called 'purple prose', where multiple paragraphs say what could be captured in a single bullet point, usually because the text was generated from that one point in the first place. The receiver then has to boil it back down to the original point to extract the essence of the information. Worse still, the convoluted message can lose the original point altogether.

 

This loss of productivity adds to the burden of an already full workday. Research indicates that people who report receiving workslop spend an average of almost two hours dealing with the confusion it creates. Estimates based on those people's salaries put the cost to productivity in the millions (around US$9 million).

 

When mental energy is required to work out how to address low-quality work, trust is lost in both the message and its sender. This breeds annoyance and confusion, damages working relationships and increases the potential for burnout.

 

Organisations would benefit from strong leadership in this area: guiding workers, promoting quality work and setting clear policies on AI use.

 

AI can genuinely augment the quality of work if it is used responsibly, with a knowledgeable human guiding the output. The need for that guidance was particularly evident with previous versions of ChatGPT, where the bot was sycophantic rather than critical of prompts, often generating rubbish or even 'hallucinations' (fabricated information presented as fact) in order to supply something rather than nothing.

 

Importantly, we as users of AI need to be mindful of what we put into our work and pass on to others, and ensure that we deliver quality, not just more workslop.

 