blog

If you like perspectives on topics across the design and visual communication industries - from strong opinions on evolving practices to practical guidance for students, professionals, and clients alike - this blog may be for you! It's a space dedicated to sharing insight, exploring ideas, and contributing to the ongoing dialogue around creative work.

The Hidden Costs of Grey Work

Graphic by Alan Smithee

Imagine your typical morning at work. I'm sure it involves sitting down at your desk and mentally preparing yourself for the day ahead. You know where you are and where you need to be by the end of the day, so you take a deep breath and...

Three hours later, you've cleared your inbox and notifications, tweaked your calendar, cleaned up your desktop, and researched at least five new productivity apps - but your actual work, the stuff you get paid to do, remains untouched.


Welcome to the world of Grey Work.

Grey Work, whether you like it or not, is what we're now forced to do to appear more innovative, forward-thinking, productive, and team-oriented at our jobs. The sad part is that Grey Work isn't really work at all; it's tech-enabled busywork disguised as productivity. And it's killing your focus, creativity, well-being, and ultimately, your soul.

The DNA of Grey Work and the Great Lie

Grey Work generally includes, but isn't limited to, endless Slack replies, click-looped research that leads nowhere, managing project management tools (yes, you read that right), and optimizing workflows more than actually using them. Essentially, it's anything that looks like work and feels like progress but is neither. In short, it's the work we do to get to our real work.

The effects of Grey Work are legion: lost time from switching contexts and tackling shallow tasks, the constant erosion of deep, focused creative effort, the perpetuation of the insidious myth of multitasking, and an always-on culture. All of this leads to suffering. Projects suffer, strategies suffer, creativity suffers, and ultimately, people suffer.

The statistics on Grey Work are shocking. A recent study showed that nearly 75% of employees spend over 20 hours a week searching for information across various technologies instead of focusing on their actual jobs. This means workers spend up to half of their workweek on Grey Work, often relying on makeshift solutions and workarounds caused by disconnected tools, systems, and processes. As a result, workers frequently push their real work into after-hours and sacrifice personal time to keep up.

All of this breeds anxiety from always being behind, even when you're technically busy. The overdose of constant dopamine hits - that glorious beep-blop-boop of just one more notification - and the mental cost of doing work that never feels meaningful or complete manifest, as far as your brain is concerned, as fatigue, dread, and guilt.

The rise of Grey Work has led to a steady decline in work-life balance, and the irony is that the very tools marketed as efficiency boosters often create more to manage. Dashboards, checklists, and constant notifications masquerade as innovation, but in reality, they add layers of complexity and micromanagement. As a result, productivity is no longer measured by meaningful outcomes, but by visible activity - an endless cycle of proving busyness rather than making progress.

When Work Becomes a Game, Value Vanishes

Grey Work doesn't just clutter your day; it also reduces the worth of your effort. When decisions can be canceled, rescheduled, or postponed with a click, they lose importance. Meetings become optional, deadlines negotiable, and human interaction (once the foundation of accountability) gets replaced by dashboards and notifications.

When responsibility is diluted, so is meaning. What once required commitment, conversation, and follow-through now feels like a low-stakes game of moving pieces around. The result: effort itself is devalued, and the work that matters most is the first to suffer.

The Solution

I have an unpopular yet forward-thinking and innovative approach to this issue: fewer tools, more humans. If you're in a position to lead or shape the conversation about how work flows, support anyone willing to reclaim focus by intentionally removing technology where possible. If you need a checklist, here it is:

1. Replace technology with human-centered activities whenever and wherever you can.

With the rise in popularity of remote work, this isn't always possible, but it should be a priority. Meetings can be just as disruptive if they're not planned, managed, and executed well. Still, one in-person meeting or phone call can be worth 50 Slack messages and about 100 group emails.

2. Write on paper.

It doesn't sync to the cloud, and some will argue it doesn't allow for collaboration. And I say, good! It also doesn't require charging, and you don't need an internet connection to use it. Writing your thoughts and taking notes on paper leads to better retention; you'll remember far more than you would typing at a keyboard.

3. Prioritize human work.

We can get so caught up in early adoption and the often hollow promise of technology's benefits that we forget how robust smaller systems can be when they rely on critical thinking, collaboration, creative problem-solving, and physical making.

4. Value quality over visibility. Outcome over optics.

What destroys productivity faster than anything is that, within the storm of Grey Work, performance is no longer measured by results - it's measured by how visible our activity is. Playing the game becomes more important than making real progress.

5. Return to real work as soon as humanly possible.

Real work is slow, messy, quiet, and iterative - all deeply human traits. It happens only when we step away from dashboards and notifications, which means it can't be measured in a spreadsheet - but it's infinitely more meaningful.

A Call to Action

My advice is to start by auditing your day to see how much of it is actual work versus Grey Work. Once you know that, decide what technology to cut and which tasks can be reclaimed by your mind, your hands, or a simple conversation.

Ultimately, we must recognize that we are not machines; we are humans, and whenever possible, we should strive to work like humans again. Our nervous systems weren't built for constant noise, endless notifications, and being constantly "on." It's too much.

When the brain faces constant stress, it switches to survival mode, leading to short-term thinking, chronic anxiety, and a loss of perspective. This isn't a discipline issue; it's an overload issue.

A Reminder of the Fight for Better Work/Life Balance

The right to rest, dignity, and fair treatment in the workplace wasn't given to us - it was fought for. Through strikes, protests, and generations of collective effort, workers drew a clear line between their personal lives and their labor.

Yet, here we are, decades later, allowing technology to erase that line quietly. Work seeps into our homes, weekends, and minds. We answer emails at dinner. We check Slack on vacations. The boundary between work and life is no longer protected; it has become blurred.

If we want to honor the fight for workers' rights, we must defend what it achieved - not just with words or social media posts about balance, but through real action.

We've earned more than just a break. We have the right to reclaim our personal time.



Thoughts on AI

Graphic by Alan Smithee

I'm not an AI expert, not by a long shot. In fact, I barely understand it, but as a creative professional with over three decades of experience, I find the topic of artificial intelligence coming up daily. Where do we begin with something this big and uncertain? Perhaps with the simplest of clichés...

AI Is Just a Tool

AI is just a tool. Like any tool, its value depends entirely on how we use it. A hammer can build a home; used maliciously, it can also take a life. Similarly, AI can enhance our creative endeavors or threaten livelihoods; the latter is a valid reason for concern.

This Has All Happened Before

Technology transforming creative industries isn't new. I began my career in the mid-90s, a pivotal time when computers disrupted traditional graphic design methods. I recall cutting rubylith, creating color separations, hand-drawing comps, and performing manual paste-ups. My grandfather experienced an even more abrupt shift. He was a layout artist at a small-town newspaper in northern Minnesota, spending decades carefully assembling each edition by hand. The day after he retired, his entire department was replaced by a single Macintosh running QuarkXPress, operated by one recent college graduate. Technology didn't just streamline processes; it eliminated skilled roles overnight. But that small-town newspaper saved hundreds of thousands of dollars a year with a very small investment in a new, seemingly innocent technology.


Similar disruptions occurred with Photoshop, which democratized photo manipulation, and with digital typography in QuarkXPress and InDesign, which diminished the role of expert typesetters. AI is following the same pattern: it offers powerful capabilities to a broader audience and displaces established professionals in the process.

Legal and Ethical Challenges

But let's not sugarcoat this. AI's utility is uniquely tied to ethical and legal issues, notably copyright infringement. AI trains on existing human creativity - the written word, art, photography, illustration, code, and so much more. OpenAI acknowledged this explicitly in written evidence submitted to the British Parliament:

"Because copyright today covers virtually every sort of human expression - including blog posts, photographs, forum posts, scraps of software code, and government documents - it would be impossible to train today's leading AI models without using copyrighted materials."

This embarrassing admission highlights the ethical issue. AI must draw from humanity's collective creative output to generate something new, and it's doing so without explicit consent or compensation.

This is a problem and the reason copyright law exists in the first place. Copyright law is designed to provide creators with exclusive rights to profit from their works. Financial rewards motivate authors, musicians, filmmakers, software developers, and artists to invest time, resources, and talent into creating original works. More importantly, copyright law grants creators control over how their work is used, reproduced, and distributed, enabling them to effectively monetize their creative efforts. Without copyright protections, third parties could easily profit from creators' works without compensation (ahem).

Denying artists these protections harms us as a society. Creators need to be able to publish, perform, and disseminate their work with confidence. Simply put, copyright law exists to incentivize creators while the public benefits from widespread access to creative works, ultimately leading to greater innovation, cultural advancement, and societal enrichment.

The flipside of the argument is that copyrighted materials eventually fall into the public domain. Our copyright framework openly acknowledges that creative works should be protected for a limited time, allowing creators to benefit and recoup their costs. Eventually, though, these works must pass into the public domain so they can continue to foster creativity, innovation, and iteration.

However, the AI business model is based on bypassing that process and jumping to the front of the line. AI companies seek access to everything now so they can iterate on it immediately and exploit it for profit, all while presenting it as a public good and disregarding copyright law and the creators behind it.

A Hint of Hypocrisy

To be entirely fair to the argument, there are examples of humans doing this the old-fashioned way. Good artists borrow, but great artists steal. Isn't that what Picasso said? He is right; art survives and grows on inspiration and iteration. So, does it matter if a machine does it?

I can't help but think about the Supreme Court's 2023 decision in the Andy Warhol Foundation for the Visual Arts, Inc. v. Goldsmith case. It was a significant case, emphasizing that even prominent, well-known artists like Warhol are subject to copyright limitations. In short, the Court sided with Goldsmith in a 7-2 decision, holding that Warhol's use of her photograph was not sufficiently transformative to qualify as fair use. An even more apt example is Shepard Fairey's iconic Obama "Hope" poster, based on an Associated Press (AP) photograph by Mannie Garcia. The case was ultimately settled out of court, and Fairey agreed to share profits and future royalties with the AP. Still, the court heavily hinted that the poster had violated copyright law.


Andy Warhol's "Orange Prince" is based on Lynn Goldsmith's photo of Prince.

Shepard Fairey's iconic Obama "Hope" poster is based on an Associated Press (AP) photograph by Mannie Garcia.


I don't have the time or the intelligence to write about every nuance regarding inspiration, style, and homage. I'm simply making the point that humans do this stuff, too. We borrow ahead of our turn, cut corners, and draw on existing works to iterate. And it seems that sometimes - not all the time, but sometimes - we look the other way if a human hand is the violator.

It's important to mention that, as of the writing of this post (May 2025), in the United States, works generated entirely by artificial intelligence without meaningful human input are not eligible for copyright protection. This is grounded in the idea that copyright law protects "original works of authorship" created by human beings. The U.S. Copyright Office and federal courts have consistently upheld this interpretation, emphasizing that human creativity is a fundamental requirement for copyright eligibility.

Congratulations, humans - for now!

Convenience and Other Things

We can say all we want about our protection under the law. Still, the internet - another displacing technology - has greased the wheels for AI, conditioning us to expect speed, convenience, and accessibility over craftsmanship and skill, and AI capitalizes on that perfectly. In the creative space, it creates a "McDesign" economy in which instant solutions replace processes that once required skill, expertise, and insight. The phenomenon isn't unlike how McDonald's succeeded by prioritizing speed and convenience over nutritional quality and culinary artistry. Design won't be the only victim; every industry will suffer as experts and thought leaders are pushed aside and we all become "experts" with a few keystrokes and a well-worded prompt.

Is Good Design Dead?

Are we sacrificing quality in the pursuit of convenience? While AI might produce faster and more accessible results, it often lacks the depth, originality, and thoughtful design cultivated by human creativity and the creative process. Yet society typically prefers the path of least resistance, even if it comes at the expense of diminished quality.

There has always been, and will always be, a large population of people (clients) who can't judge - or don't care about - quality in the context of their messaging. They will always gravitate toward the immediate, frictionless approach of having an idea now and realizing it in five minutes. I fear a creative space will emerge that acknowledges the lack of quality in AI-generated ideas but embraces that limitation and builds concepts around it, much like enterprising creatives used stock photography in the 90s. Some of it will be good and clever, and those ideas will serve to justify the rest, eventually becoming a viable approach that is normalized over time.

AI's Place in Our Creative Future

Despite fears and criticisms, it's clear AI isn't going anywhere. However, technology adoption isn't always predictable. Consider 3D televisions or virtual reality - both highly touted innovations that failed to gain broad cultural traction. They were cumbersome and required too many extra steps to use reliably. AI could face similar hurdles or unforeseen limitations, at least in its current form. I do not doubt that AI will assist mathematicians and scientists in curing cancer, inventing a better light bulb, or solving other critical human problems. But the jury is still out on whether it can reliably provide accurate information or help me organize my calendar.

Nevertheless, embracing AI with open eyes and cautious optimism is wise. AI can automate repetitive tasks, enhance creative workflows, and enable professionals to concentrate more on conceptual innovation and strategic thinking. As Kevin Kelly, co-founder of Wired magazine, once said:

"Machines are for answers; humans are for questions."

Perhaps the key lies in humans leveraging AI to handle routine tasks, freeing creative minds to pose more profound and insightful questions. However, we must hold ourselves accountable for managing the technology properly, a skill that we as humans seem to lack.

Wrapping It Up

Technology is continuously reshaping how we live and work, possessing a unique ability to enhance our lives while also presenting new challenges.

Ironically, even as I write this post, I am leveraging multiple technologies to refine my thoughts: Google for research, word processing software to catch grammar and spelling errors, and my own website for publishing - all of which eliminate the traditional need for research assistants, proofreaders, or an editorial entity to endorse my work. Furthermore, I will trust AI (for this article) to ensure my message is clear, concise, and coherent, eliminating the need for an editor. Finally, I will distribute this article via social media without involving a marketing professional or PR specialist.

Had I employed all the seasoned professionals I mentioned above, they undoubtedly would have identified errors or suggested improvements that I and all these technologies overlooked. Yet the reality remains that this post likely would never have existed had I followed the traditional publication path. It demonstrates that technological advancement empowers creators even as it disrupts established norms, giving people access to opportunities they otherwise would not have had. This leads us to one final, perhaps unanswerable, existential question.

Is it ultimately beneficial to have such abundant access to technology? Or does unlimited availability lessen the quality and impact of our collective output?