Let’s get started! First assignment:
Breathe. Really. We don’t breathe as much or as well as we should, and “AI” can get you a little out of breath!
Okay, but we have to do a little work, too. Find an LLM; if you have one already, find one that you’re less familiar with. Choose from:
A local LLM through LM Studio, Msty, ollama, etc. (This will require a little bit of computer work, installing an app, choosing a model, etc. We’ll cover this more in the coming weeks, so don’t feel obliged to start this yet.)
I believe those are all free to start using, at least a bit.
Send me an email if you got stuck here and can’t proceed: kaminski@istori.com.
My opinionated feelings about the different models: Claude is better for most things; it’s “smarter.” ChatGPT and Perplexity are both fine, and better than Claude if you need web search or run into a topic that Claude is guardrailed against. Local LLMs are best if you don’t want incremental ongoing cost, or if you need to keep your conversations out of “the cloud.”
I didn’t mention Copilot or Gemini or anything else; I’m skeptical of them, and I haven’t found them as useful as the above (even though they offer some great stuff). But feel free to go around my opinions!
Okay, now some prompts to try, choose your difficulty level:
Easy (do one, get the answer, do the next):
Write a story about Tupence and Salud, two animal pals that get into amazing adventures and jams, but always manage to get out and recover with wit and pluck. End the story with a seed to start the next one.
How did Tupence get (his, her, their) name?
How did Salud get (his, her, their) name?
Write the next adventure.
Medium:
I’m planning to start doing “homework” on working with LLMs. Create a (one-week, or four-week) schedule for me that will help me get acquainted with various LLMs and how they work. 5 days on, 2 days off, 20 minutes per day, best if it’s not contiguous.
(Continue with more prompts, guiding the LLM to create an experience you’ll enjoy.)
Hard:
Start with this: I understand a data center takes 1 GW to run, and that many data centers are needed to support AI. Compare this with other modern industrial uses of energy. Justify why AI must or must not be allocated energy, given its use for other needs and the amount of carbon produced (and thence, climate change).
Drive the conversation forward, always examining your assumptions and the AI’s assertions. Personally, I would have this discussion more with ChatGPT or Perplexity (or maybe even Gemini) than Claude — bots that have Internet access. I generally like Claude’s reasoning best, but this conversation might benefit from web access, which Claude doesn’t do.
From the call, here is a download link for the PDF (and Markdown source, for Markdown geeks):
https://peterkaminski.wiki/_attachments/Peter_Kaminski_-_LBW_2025-03-19.pdf
https://peterkaminski.wiki/_attachments/Peter_Kaminski_-_LBW_2025-03-19.md.txt
I’ll close with today’s exclusive AI Homework image from my Midjourney work. I hope it sparks joy. See more at my Pixelthesia Bluesky feed.
All best,
Pete
Day 1 prompt and response:
What economic benefits per unit energy has AI generated to date?

The economic benefits per unit of energy that AI has demonstrably generated to date are more limited and nuanced than many projections suggest:
Realized vs. Projected Benefits:
While projections about AI's economic impact are abundant, concrete data on realized economic benefits specifically tied to energy consumption is sparse
Most major efficiency gains remain in early implementation phases rather than at scale
Documented Cases:
Google's DeepMind reduced data center cooling energy by 40%, saving approximately 10-15% of overall data center energy
Industrial applications like Siemens' AI for manufacturing optimization have reduced energy consumption by 10-20% in specific factories
Smart grid optimization using AI has improved efficiency by 3-5% in limited deployments
Economic Return Calculation Challenges:
The energy consumed during AI development (particularly training) is rarely factored into ROI calculations
Many companies report cost savings from AI implementations but don't measure against the full energy lifecycle of developing and deploying these systems
Current Reality:
Most enterprise AI deployments remain focused on revenue generation rather than measured efficiency improvements
The highest-volume AI uses today (consumer services, content generation, chatbots) have limited documented economic efficiency benefits per unit of energy
Research AI has generated scientific breakthroughs, but their economic impact remains largely prospective
Concrete Examples With Data:
Retail inventory management systems using AI have reduced waste by 20-30% in pilot programs
Precision agriculture AI systems have reduced water and fertilizer use by 15-30% while maintaining yields
AI-driven predictive maintenance has extended equipment life by 10-20% in industrial settings
The current situation suggests that while specific AI applications have demonstrated clear efficiency gains, the aggregate economic benefit per unit of energy consumed by the AI industry as a whole remains difficult to quantify definitively. The most promising areas are industrial optimization, resource management, and infrastructure efficiency rather than general-purpose consumer AI applications.
I'm a few days behind, but I'm committed to catching up. The educational resources I've found tend to be either too technically focused on how LLMs work or too superficial to give me better insights into practical applications beyond my daily superficial use of ChatGPT as a research and writing assistant. My ChatGPT queries about data centers were informative and thought-provoking. It appears that AI and climate change will drive an unprecedented scale of human transformation, with both "horses out of the barn" and almost no mention of either in the last political cycle. ChatGPT proposed a very good and logical (to me) assertion that business, government, and civil society must urgently collaborate on a few critical questions.