Over the past year, I’ve consciously noticed my attention span begin to slip. Long articles are often interrupted by me opening social media. Deep work sessions give way to checking my texts. Whatever I’m doing, my brain yearns for the shorter-form version, such as reading Twitter as a break from a longer essay. The constant barrage of short-form videos and personal content tailored by the algorithm makes escaping to a place of focus all the more difficult. The positive feedback loops of these platforms make them better with every interaction, which makes it even harder to put them down when a task requires my full attention.

Despite this, I do feel as though my generation is one of the last to have some semblance of self-restraint around short-form content. My parents set up our first family computer when I was five or six years old, and I was only allotted a certain amount of time online per day. My first cellphone was a flip phone when I was 12, and my first smartphone probably came around the time I started high school. Each of these device upgrades felt socially motivated, with a critical mass of my peers onboarding to the next platform at the same time. At the time, each felt like a major shift in how we interacted and consumed information. The recent iterations of personal devices are far more incremental, and I suspect it will feel that way until wearables and/or vision-based systems reach a level of performance where the whole hardware paradigm inflects. It took years of iteration before the iPhone broke open the smartphone category; we’re still waiting for the iPhone moment in wearables and virtual reality.

I believe this step-by-step rollout of indistinguishable consumer hardware upgrades is a product of dominant social acceptance removing the need to innovate. Everybody is used to the way an iPhone feels and the expected experience of run-of-the-mill social applications. A quantum leap in design or functionality would likely be met with strong resistance, even if it resulted in an intrinsic improvement. The critical factor here is that the adoption curve has plateaued and is now being replenished by new users entering the market. Importantly, these new users aren’t churning off an existing product; they’re being born.

As of 2015, smartphone adoption in the US rose roughly linearly from ages 8 to 15, at which point 71% of children had a phone. By 2021, 71% of children had a phone by the age of 12, with a 34% increase in ownership from ages 11 to 12. More staggering is that US tablet ownership is 22% higher in households with children under the age of 18 than in those without. Tablets were more common in households with children even when controlling for the socioeconomic characteristics that affect computer ownership. The National Survey of Children’s Health in 2020 reported that 26% of all US children spent four or more hours per day in front of a non-school-related screen. The American Academy of Pediatrics recommends limiting young children’s screen exposure to one hour per day.

Children are inundated with content from a very young age. Excessive screen usage can lead to academic underperformance, problems in social-emotional development and competence, and poorer attention and focus. I find the data striking, and anecdotal evidence provides strong reinforcement. TikTok videos often pair clips from a television show or movie with gameplay footage from a video game like Subway Surfers or Grand Theft Auto. Short-form content is being further segmented to enhance dopamine rewards. It’s this multitasking across media that has accelerated the decay of attention.

I do believe there are plenty of techniques for parents to limit screen time, although I worry many choose not to use them. This trend has collided with the rapid pace of continued technological advancement, including artificial intelligence, which adds further shortcuts to academic and professional growth. Technological deceleration isn’t worth entertaining, so in my opinion the best course of action is to adapt to and embrace these realities. Kids who grew up in tablet families are still participating in higher education and securing conventional post-grad jobs, but I wonder whether future cohorts will require fundamentally different career options to compensate for screen-induced short attention spans. What could these low-attention careers be?

I’d expect careers for people with short attention spans to skew towards rapidly evolving, dynamic environments that require distributed focus across a number of different tasks. My mind initially goes to traders at banks or hedge funds. However, with higher technical skill requirements and ever-increasing competition for seats at these firms, the low odds of success (and of beating the market, for that matter) may deter potential candidates. That said, people who develop outstanding programming skills may have their pick of algorithm-driven firms, but even then, those positions are typically reserved for elite minds, not a large swath of the population.

Brand identities will remain important, and as new consumer trends emerge, brands will depend on chronically online people to stay with the times and speak directly to a specific audience. In recent years, we’ve seen otherwise innocuous organizations push the boundaries of appropriate corporate marketing. Wendy’s and the Buffalo Bills are two franchises that have leaned into this, amassing millions of followers across social media thanks to campaigns that may seem bizarre to those not in the mix but are relevant to those who are. With screen time increasing, I’d expect this meme marketing to become more prevalent across a wider range of brands, and for large groups of people to aim at landing roles to supply it.

Lastly, I have thought about how the government may want to capitalize on all of this. Army advertisements on television make service to one’s country seem to play out like a Call of Duty match, while operating weaponry has evolved to mimic the familiar gestures of video game controllers. Perhaps it will be the government that subverts this bastion of niche culture by exploiting low attention spans for propagandist initiatives. I wouldn’t be surprised to see dedicated recruiting channels target the new online class for the good of the country.

None of this is to say that traditional career paths will go extinct; I still think the lure of prestige and money will attract ambitious yet lost students who see the herd and follow. However, the baseline traditional corporate job will likely find itself falling back on a disinterested talent pool, requiring a fundamental shift in the way companies hire and operate. Whoever gets ahead of this trend and positions themselves to either outcompete their peers by remaining focused or start a business that turns low attention into a competitive advantage will be in a strong position for success. I’m not sure what today’s average 12-year-old will make a career in, but I do think it’ll differ greatly from those just a half generation older.
