Attention, Ambition, and AI: An Honest 2025 Reading List


For the last ten years, when someone asked me if I eat healthy, work out, sleep enough, or read, I’d say, “Not as much as I want to.” Now that my kids and company are out of their toddler years, the hard truth is that I do all of those things exactly as much as I want to. In January, I realized I was chronically staring at my phone in the dead zone between my kids’ bedtimes and my own. So I swapped scrolling (or some of it, at least) for reading. No big plan or personal challenge, just an experiment. The habit stuck, and I haven’t read this much since college. I’ll spare you book reports and unsolicited advice on why you should read more. What I do want to share is my journey through an eclectic pile of books and the themes that emerged, from the perspective of a tech co-founder and father trying to rebuild a reading habit.

Winter

With a desire to escape the algorithm top of mind, I started with How to Do Nothing (Jenny Odell), a quick read about rejecting the attention economy and the pressure to be constantly productive. My mind quickly turned to fantasies of building a new, distributed internet ecosystem where humans could interact outside of hyperscale platforms. I’d name it “Pera,” after the imaginary utopian island Odell describes where your attention is truly free. (Turns out there is no shortage of open-source projects with this goal.) Clearly, my monkey brain had not yet internalized the lessons of this book.

From there, I went to The Rigor of Angels (William Egginton), which I heard about from MIT Professor Peter Fisher’s keynote at a conference in 2024. It drew me in with its personal stories about physicist Werner Heisenberg, triangulating his intellect with Kant and Borges, all of whom wrestled with the limits of what can be known. For Heisenberg, that was the famous Uncertainty Principle. Borges wove it into mind-bending, paradoxical short stories. This sent me directly to the anthology Labyrinths (Jorge Luis Borges), a book given to me by a friend possibly twenty years ago that had sat on a bookshelf until now. (If you’ve read Labyrinths, you will appreciate the awesomeness of finding a dusty paperback in your basement whose first story involves a mysterious book.) When it seemed things couldn’t get more absurd, I got to “Pierre Menard, Author of the Quixote”, which poses the question: if two authors from different places and eras wrote the same book word-for-word, is it the same work? It got me thinking about AI and the significance (or perhaps insignificance?) of people putting content into the world, some created by human talent, some by chatbots. Then Borges drops this:

“To think, analyze and invent…are not anomalous acts, but the normal respiration of the intelligence. To glorify the occasional fulfillment of this function, to treasure ancient thoughts of others, to remember with incredulous amazement that the doctor universalis thought, is to confess our languor or barbarism. Every man should be capable of all ideas, and I believe that in the future he will be.”

If that was a prediction, I’m speechless.

Spring

I put up numbers in Q2, stringing together quick wins across four genres. At work, I had a breakthrough in Document AI for manufacturing drawings, but I was also wrestling with the fact that a service we built was applicable well beyond our ideal customer profile. Was there any way we could sell access to this API without the usual sales and service requirements of vertical SaaS? (What technical founder doesn’t dream of software that sells itself?) Our new VP of People, Elana Silver, told me this was quintessentially The Innovator’s Dilemma (Clayton Christensen), which, embarrassingly, I had never read. According to Christensen, to a tech company, new innovations are either sustaining or disruptive, determined not by the nature of the technology but rather by how it fits into a company’s existing go-to-market engine. AI is certainly both sustaining and disruptive, depending on how you use it and integrate it into a user experience. From our perspective, trying to sell an API service to an entirely new market is disruptive, while integrating that same service into quoting workflows is sustaining. In the early phases of a startup, disruption is the goal. But now, we’re a “scaleup”. We can’t ignore disruptive innovations, but if our goal is to turn new technology into revenue and maintain our premium position in the market, the best strategy is to deeply integrate AI into the product.

At Jason Ray’s recommendation, I took down Chip War (Chris Miller) next. As an electrical engineer by training, I know a fair amount about how semiconductors operate and are manufactured, but the book is a vivid reminder of just how reliant state-of-the-art chips are on ASML and TSMC, two foreign companies with a complete chokehold on computing. We truly live in a world shaped by infrastructure that most people never see or understand, and it’s becoming increasingly clear that, as a society, we’re struggling to meet the challenges technology poses. That’s a topic I’d return to later in the year, but in the meantime, I needed to think about things that were actually within my control. The Big Leap (Gay Hendricks) found me at the right time, as I was struggling with procrastination and lack of direction, even while life was going well. There’s so much written about founders needing to be resilient in the face of rejection and failure, but even modest success can be uncomfortable. Hendricks details how people have an internal thermostat for how much success or happiness they can have and tend to self-sabotage when things get good. The solution to the “Upper Limit Problem” is to unapologetically operate in your “zone of genius” (not unlike Dan Sullivan’s “Unique Ability” framework).

Realizing I had made it this far without reading a novel, I turned to Neuromancer (William Gibson), the original cyberpunk story, recommended to me by Chris Holt (“my writer friend”, as I refer to him in Boston). It’s a beautiful book. It makes you wonder: as those of us in technology work to expand the capability and reach of AI, to what extent is that self-actualization, and to what extent are we serving an exponential curve? We’re pouring our efforts into making these systems better and driving adoption, as the hype cycle demands. Increasingly, this work related to AI is driven by AI, as it helps us shape strategy, write code, prepare for and summarize meetings, and so on. Is it crazy to think of this as the beginning of AI being behind the wheel?

Summer

Wanting to tackle something epic (and after a few false starts), I tried Ulysses (James Joyce), since A Portrait of the Artist as a Young Man had a big impact on me in college. This is a long, toilsome read, at times seeming to go nowhere and be about nothing. It doesn’t help that reading a thick book on a Kindle is deeply unsatisfying. Still, it was exciting to check in on Stephen Dedalus, and Joyce’s ability to capture stream-of-consciousness thoughts across diverse characters is amazing. I made it to chapter 15 of 18 (a 150-page episode in which a character is told to “talk away till you’re black in the face” and about which Joyce himself said “it will keep the professors busy for centuries”), but I have yet to finish the last three chapters. Going from a fun sci-fi paperback to one of the densest works in English literature was probably not the best idea. Either way, it left me feeling undisciplined, and I needed a quick win. I reset with Willpower (Roy Baumeister): a light pop-psychology read, arguing that willpower isn’t a virtue, it’s a depletable resource that needs to be managed as such.

Fall

As my company started 2026 planning, I revisited a startup classic, which I don’t think I ever actually read cover-to-cover. Mike Breslin challenged our whole team to read (or re-read) Crossing the Chasm (Geoffrey Moore). It was originally published in 1991 but might as well have come out today. At Paperless Parts, we lived this book, getting stuck in the chasm in 2019 before unlocking our beachhead. Of course, we didn’t execute perfectly. One warning that really resonated was about sales strategy and the consequences of being too scattershot. It’s so hard to see these patterns while you’re living them. After talking with a co-worker about how well Moore holds up, he mentioned Good to Great (Jim Collins) as another classic. The lessons in it are timeless, although the book, published in 2001 right after the dot-com bust, handles the internet awkwardly in ways that distract today’s reader. Ironically, while I was reading it, I saw the news about Walgreens, one of Collins’s eleven “great” companies, selling to private equity.

Annual planning forces you to think about the bigger picture. The Coming Wave (Suleyman & Bhaskar) is on every AI thought leader’s reading list, and it exceeded expectations. AI technology is moving so fast that this book’s ability to project even two years into the future comes off as impressive (after all, it was published way back in 2023). Suleyman and Bhaskar paint a pretty depressing picture of our society’s likelihood of being able to “contain” AI by meeting the many challenges coming at us. Looking at the state of the world, is it credible to think we’ll enact sane regulation, do anything to prevent massive job loss, and ensure that the wealth produced by this technology is distributed equitably? This book is an emotional experience as it breaks down the huge changes that would be required.

For some closure, I had to zoom out even further. From 1935 to 1975, husband and wife Will and Ariel Durant wrote The Story of Civilization, an eleven-volume, 13,000-page history of humanity, none of which I have read. However, at the end of their careers, they published The Lessons of History (Durant & Durant), a short book of quick, essay-like chapters that compile the lessons they learned from a lifetime of study. I’ve never been so sure our current civilization is collapsing and never been less worried about it.

Just check out this quote from 1968:

“Democracy is today sounder than ever before. It has defended itself with courage and energy against the assaults of foreign dictatorship, and has not yielded to dictatorship at home. But if war continues to absorb and dominate it, or if the itch to rule the world requires a large military establishment and appropriation, the freedoms of democracy may one by one succumb to the discipline of arms and strife.

“If race or class war divides us into hostile camps, changing political argument into blind hate, one side or the other may overturn the hustings with the rule of the sword. If our economy of freedom fails to distribute wealth as ably as it has created it, the road to dictatorship will be open to any man who can persuasively promise security to all; and a martial government, under whatever charming phrases, will engulf the democratic world.”

As the Durants tell it, collapse is normal and inevitable. “We should not be greatly disturbed by the fact that our civilization will die like any other,” are hard words to process as an elder millennial raised with a sense of national pride and during several boring presidencies. Systems and governments turn over, power and wealth shift, and life goes on. A volatile world is risky, yet there are millions of people going about their routines just miles from war zones. What is an individual called to do on a planet with such rotational inertia?

“If a man is fortunate he will, before he dies, gather up as much as he can of his civilized heritage and transmit it to his children. And to his final breath, he will be grateful for this inexhaustible legacy, knowing that it is our nourishing mother and our lasting life.”

And that I find oddly motivational.