The latest in Intel entitlement news:
A close-up of what Atom allows today:
(Typo in the linked headline: ISSCC, not ISSC.) https://hothardware.com/reviews/intel-details-nextgen-radios-solarpowered-cpus
Imagine having a seat at the table that develops laptops. If you're Samsung, Dell, HP, ASUS, or Lenovo, you develop laptops. And yet all these laptop makers think more powerful laptops are always better. To what extent is product design social conditioning?
"It's a pretty ugly table, guys." https://www.youtube.com/watch?v=XBzhk5eANTU Imagine being an OEM and offering a competitive product- a company willing to take risks, somehow has little time to explore alternative laptop designs.
Lots of OEMs and all-in-one makers benefited from setting a floor on pricing- they needed an excuse to keep the 6-cell battery in their laptops.
Intel provides the transistors, AUO/Asus/Samsung make the bright LCDs/OLEDs, and lithium suppliers continue to make a large portion of their sales from laptops.
https://www.blog.baldengineering.com/2024/10/intel-sets-record-with-2d-tmd.html It seems Intel finally disclosed some IP- probably because they expected to get funding for doing nothing. Now of course others are going to try to copy this. But at least the taxpayer isn't treated like a complete idiot and kept out of the loop.
Interesting to see Congress criticizing Intel. Here's an idea, Intel: restart your Quark series of processors. But this time, include more than 0KB of RAM!
https://www.msn.com/en-us/video/money/why-are-chip-competitors-trying-to-takeover-intel/vi-AA1rWvBZ As news-buzzy as these stories are, neither Qualcomm nor ARM are likely to be able to purchase any part of Intel, due to CHIPS contingencies: https://www.electronicsweekly.com/uncategorised/intel-to-get-chips-act-funds-by-year-end-but-must-not-sell-units-to-arm-qualcomm-2024-09/ https://www.tomshardware.com/tech-industry/intel-confirms-dollar3-billion-award-for-secure-enclave-18a-chips-coming-to-us-military
Yes, it has been since late 2020.
The companies I make fun of don't take solar power, or its benefit to humanity, seriously enough. Therefore I don't take them seriously, in equal proportion.
"Isaac Asimov supposedly once said “The most exciting phrase to hear in science, the one that heralds new discoveries, is not ‘Eureka!’ but ‘That’s funny…'” https://quoteinvestigator.com/2015/03/02/eureka-funny/#google_vignette
"A thematically related passage appeared in 1965 within the London periodical “Science Journal”. An article by British journalist Gordon Rattray Taylor discussed the process of scientific discovery:3
The popular picture of the scientist is of a man who is visited by a flash of insight and cries, in effect, “Eureka!” Or, more modestly, as a man who notices something others have ignored and mutters “That’s odd.”
To be realistic, however, the successful scientist often seizes on a new tool or a new technique and applies it sooner than anyone else."
I do not have an institutional position that represents expertise; therefore I am less obligated to make a tidy and neat project. That said, this page could be tidier. It is more a scattered collection of long-ruminated thoughts than a completely consistent focus. That I joke about many things doesn't detract from the proven technologies demonstrated by others here (and in my own solar tests). I don't think this project would be possible without a little humor. For those who still don't understand the many contradictions here: they involve some cognitive dissonance, along with the need for abductive reasoning as a common form of analysis. I particularly think disbelief also has a lot to do with agnotology, a word I just learned but have understood for decades as the study of both willful and unwitting ignorance/doubt.
Computer Science is deep. Physics is deeper. Your app probably isn't deep. Jack Handey wrote Deepest Thoughts: So Deep They Squeak in 1994. If there's plenty of humor at the bottom, I want to be there.
Another phrase I learned from the Opt-Out project is "epistemic literacy". Epistemic literacy requires basic knowledge resources- such as researched books and encyclopedias- rather than software that does acrobatic tricks across tens of hops in a traceroute to ChatGPT. In other words, epistemic literacy requires basic computers (or in Gen-Z speak, based computers) for learning. In retrospect, it makes some of the previous criticism towards the OLPC seem a lot more like "cruel pessimism", whereas the "cruel optimism" was a lot more epistemic and not as ahistorical as it seems...
Reason 1:
This project would be a joke if it weren't so sad. Academia, industry, and government consider this project so low on the totem pole that if it's on the pole at all, it's under the visible portion. It took 10 years to put someone on the moon, and it's been 13 years since the Samsung NC215S laptop was a product. So in a lot of ways, it's harder to put a solar panel on a laptop than it is to put a person on the moon or to explore the bottom of the sea.
Reason 2: Multitasking... is not that important
The second reason is based on a prediction of computational trends called "good enough computing," observed from Moore's Law around 2009. The basic observation I will add is that most computers- even a quad-core i3 from 2011- are good enough for 80-95% of a user's needs (provided a fast enough solid-state storage device). Not only that, but a user is not a super multi-tasker, however much a resume or CV is supposed to impress by stating otherwise. Yes, there are super multi-taskers, and multi-tasking is a skill that many people have, some quite well. But internet use such as banking, reading, and scholarship is often done serially, in careful analysis. Thus piling more cores onto a single mobile device is becoming less necessary locally, when the heavy lifting could happen on another device in the home ("casting") or in the cloud (if the cloud host is trustworthy enough). This does not remove the utility of fast i7s, i9s, and Ryzen 9s altogether, nor should it.
What it does emphasize is that increasing display resolution- from 480p monitors, to 1080p monitors in the mid-00s, to 4K monitors in the mid-10s, to VR headsets and goggles- increases the density of information for human interaction and productivity, often exceeding the mind's ability to interpret, analyze, and synthesize feedback. Multi-taskers' abilities are plateauing, so user interfaces can run on thin-client hardware platforms that do not require that level of inefficiency. While I have criticized AR goggles and VR before, the point was never absolute; it was about prioritizing chips' energy design, and software that delivers basic information to the markets with less access. Granted, some of this technology wasn't even possible until a little more than a decade ago, but there is little reason today not to prioritize and optimize a user interface within a small power envelope. Shortly after rockets were developed, the U.S. and the U.S.S.R. wasted no time putting them to use...
https://www.youtube.com/watch?v=thZUMaGEE-8 "We can split the atom but not distinguish truth. Our information is failing us | Yuval Noah Harari" 10/18/2024
As computers become more powerful in smaller envelopes, one day there might be quad-core or even octa-core processors running 10 Chrome tabs on microwatts of power, fed by body heat, RF, or solar power. To get there, though, one really ought to make a solar-powered Nokia 6110 first.
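Whether that is plausible comes down to simple arithmetic. Here is a back-of-envelope sketch- every constant is an assumption for illustration, not a measurement of any real device- comparing what a palm-sized solar cell harvests against what a frugal chip draws:

```python
# Back-of-envelope energy harvesting check. Every constant here is an
# assumption for illustration, not a measurement of any real device.

def harvested_uw(area_cm2, irradiance_uw_per_cm2, efficiency):
    """Electrical power from a small PV cell, in microwatts."""
    return area_cm2 * irradiance_uw_per_cm2 * efficiency

# Office lighting delivers very roughly 50 uW/cm^2 of usable light;
# full sunlight delivers about 100 mW/cm^2 (i.e., 100,000 uW/cm^2).
indoor = harvested_uw(10, 50, 0.07)        # ~35 uW from a 10 cm^2 cell
outdoor = harvested_uw(10, 100_000, 0.20)  # ~200,000 uW = 200 mW

mcu_budget_uw = 500  # assumed draw of a frugal MCU doing useful work

for name, p in [("indoor", indoor), ("outdoor", outdoor)]:
    ok = "covers" if p >= mcu_budget_uw else "cannot cover"
    print(f"{name}: {p:,.0f} uW harvested, {ok} a {mcu_budget_uw} uW load")
```

The gap between those two rows is why a calculator works under a desk lamp while anything bigger needs sunlight, a larger panel, or a battery to integrate energy over time.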
Reason 3:
If you could identify a low-cost, low-risk information bottleneck that could be commodified as a counter-intelligence operation, would you (a) sell it to DARPA, (b) do nothing but file a FOIA request that takes years after the discovery, or (c) educate the public?
Yes, but mainly as an unsuccessful product or concept.
It's partly inspired by the Apollo program to place someone on the moon by 1970. Also, Steve Jobs designed the 1984 Macintosh as a "computer for the rest of us," according to a 1990 interview with WGBH Boston. He integrated various technology components that previously were only used in technical laboratories. Tim Berners-Lee, in 1989, combined ideas that already existed- hypertext and the internet protocols: "In 1989, CERN was the largest Internet node in Europe and Berners-Lee saw an opportunity to join hypertext with the Internet: I just had to take the hypertext idea and connect it to the TCP and DNS ideas and—ta-da!—the World Wide Web. — Tim Berners-Lee[32]"
In essence, it follows a long tradition of technological convergence, from the integrated circuit in Makimoto's wave to even "intangible" things like protocols (e.g. HTTP/hypertext and TCP/IP). Source: Semiconductor Engineering
Modified chart from above link
As you can see, this project is not all about computational efficiency at higher power levels. It is about computational efficiency at the lowest power levels possible, to run in energy-autarkic form factors.
GitHub's own ReadME Project has a lot to say about that.
Also: reduce first, then reuse, THEN recycle. The misconception is often to buy without any long-term strategy, dispose, and naively think someone else will sort it out. Chances are the screen of your smartphone isn't cracked, and you're only upgrading to get a faster processor. If you could mail your phone to, or drop it off at, a local upgrade center that swaps the chip, memory, or motherboard, you would reduce the cost of manufacturing new screens and cases- if only society believed in a standard mobile form factor option. Fairphone is its own ecosystem. One company alone won't be able to do it unless its de facto standard is popular. What's stopping you from adopting one?
There are more efficient processors out there, which will undoubtedly be more practical for low power, but the timeline to develop such products- and the middleware/emulation/application programming interfaces around them- would require thousands, if not hundreds of thousands, of coding hours, if it is ever prioritized at all. A simpler solution for underdeveloped and emerging markets is to stay the course with the most conventional architectures (an ARM A720E on 2nm, for example: how much power would it consume at 200 MHz with 16MB of RAM?). This would allow chipsets to be developed within a few years, rather than more than 5. Without an architectural license like those held by Apple, Intel, and Samsung, amateur chip designers aren't going to have access to the most efficient cores unless an open source one is developed.
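Nobody outside ARM and the foundries has real numbers for a core like that, but the standard first-order CMOS model (dynamic power ≈ activity × C_eff × V² × f, plus leakage) gives a feel for the answer. A minimal sketch- every constant below is an illustrative guess, not ARM or foundry data:

```python
# First-order CMOS power model: P_dynamic = alpha * C_eff * Vdd^2 * f.
# Every constant here is an illustrative guess, not ARM or foundry data.

def core_power_mw(c_eff_nf, vdd, freq_mhz, activity, leakage_mw):
    """Estimate core power in milliwatts.
    nF * V^2 * MHz works out to mW, so no unit conversion is needed."""
    dynamic_mw = activity * c_eff_nf * vdd**2 * freq_mhz
    return dynamic_mw + leakage_mw

# Hypothetical efficiency core on a leading-edge node, clocked way down:
p = core_power_mw(c_eff_nf=0.2,    # assumed switched capacitance per cycle
                  vdd=0.55,        # near-threshold-ish supply
                  freq_mhz=200,
                  activity=0.2,    # fraction of gates toggling per cycle
                  leakage_mw=0.5)  # leakage dominates at low clocks

print(f"estimated core power: {p:.1f} mW")  # ~2.9 mW with these guesses
```

With guesses in that range, a down-clocked application core lands in the low single-digit milliwatts- within reach of solar budgets- but note how the fixed leakage term looms larger as the clock drops, which is exactly why the low-leakage processes discussed further down matter.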
https://www.eenewseurope.com/en/intel-cpu-architects-leave-to-form-risc-v-startup/
https://www.reddit.com/r/hardware/comments/1f0j0kh/former_top_intel_cpu_architects_launch_ahead/ https://en.wikipedia.org/wiki/Traitorous_eight
Legacy Software Developers who have moved on to new platforms (or new paid work):
Vintage software salvagers/dumpster divers:
It's actually my fault. I was admitted to a university under a College of Liberal Arts and Sciences program, because I would not have stood a chance applying to the engineering school, with which I had only a tangential familiarity and interest at the time. I still haven't learned linear algebra or Differential Calculus II, and I do not have an M.S. or E.E. after my B.S. (though that didn't stop Steve Jobs from building anything, even without a B.A.).
It's also likely that industry is more interested in developing products that fit into nice, clean boxes- claiming that meeting climate goals for carbon emissions is easier by checking all the boxes on each individual product's carbon footprint- while discouraging Right to Repair product lifecycles, which collectively can re-use and re-pool parts into the second-hand market. Instead of trying to repair expensive products, it's better to develop a basic platform separate from all of that: one that doesn't create disposable hobby/edutech waste with incompatible form factors, but can be upgraded over time, piecemeal.
For Gross National Product, see this Investopedia definition. There are other metrics of growth, such as GNH, SPI, and HDI.
The most valuable U.S. exports, by GDP, are Boeing airplanes, then cars (I think). That so much of our economy depends on building new iPhones is not good for the environment. If I were in charge, I would put more people to work building cheaper but quality homes, more jobs, a modern and fully renewable energy grid, and more open source cars and body shops for them- open source cars that could also be exported to other countries (built with the same quality of materials as in the U.S.), so that the things that make this country so desirable aren't sold only here. Instead, there are students with $200,000+ in student loan debt working at Starbucks, because academia has no work for them- even counting objects in a laboratory (as in an early comic from the 00s) would be more productive than cars idling around a block for a drive-through (when a dorm could have a cappuccino maker). These projects are far from related, but they represent society's fear of consumerist homogeneity: material acquisitions and status symbols are important, but have less to do with computers unless one needs to own a Mac or have a Ferrari logo on a laptop, and cultural homogeneity is less of an actual risk even with increased material homogeneity. Remember Bobos in Paradise? Silicon Valley hasn't changed its wish to be bobos, but has shifted the cost of living onto planned obsolescence and techno-feudalism, which contributes to inflation of large things.
The short-term answer is no. The long answer is that it will save millions of consumers billions of dollars over the course of decades. It can result in mild revenue for component manufacturers, but less for vertical designs that are mutually exclusive. For example, a single-board-computer foundation could make a tiny board that fits inside a phone. It could build an SD-sized module, and Gigabyte, ASRock, MSI, and Asus could build phone chassis, with Innolux and AOC making displays. Established OEMs might benefit, but there is nothing stopping startups from doing the same. New and used gas-powered vehicle sales (not usage) are being phased out in some countries and states by 2035. Does anyone plan to be using a cell phone in 2035, or some holographic Kinect device that recognizes hand gestures and neural telepathy? It's very likely that phones will still be around in 2035 in a similar form factor, and the market for flip phones and upgradeable chassis, while not huge, still represents an important part of communication infrastructure. The same applies to tablets and laptops. The motherboard is scalable.
Yes- two, and another one is currently being reviewed.
Also, check out these two articles:
https://www.experimental-history.com/p/the-rise-and-fall-of-peer-review
https://www.asimov.press/p/peer-review
There are a number of state-of-the-art memory technologies, processors, low-power displays, and radios out there that can be integrated into a laptop, listed here and updated continuously since 2021. I do not know which combination of components works best, but the Law of Ergodicity suggests each permutation will eventually produce the right combination. I am kidding, of course- it requires an educated analysis to make the fewest number of guesses. Kind of like a Rubik's cube? I am not a shape rotator- I am a word rotator. Yes, I made that up too. Few, if any, companies are integrating them for users. If anything, they are prioritizing AI, because of "shiny thing syndrome." They believe that consumers have too many privileges, and think that solar powered computers are too slow to be worth trying. Or it is being worked on, but they aren't telling anyone, and could try to milk it for all it's worth once i3-i7 patents expire in 99 years (which, if so, would ensure profits for Intel well into the 22nd century)- so why should consumers wait another 100 years before anyone realizes it's been possible since ~2011? A handful of academics and companies have already demonstrated it, but ask them one too many questions and they will ignore you. In 1967, my late professor Carl Woese was practicing Sanger RNA sequencing at a time when only a handful of people in the world knew the technique. Solar computing is not even a novelty. But you won't hear the media talk about it, because they have no problem airing a bully who says things like:
"They want electric planes. What happens if the sun isn't shining while you're up in the air?" https://substack.com/@introvertcomics/note/c-60467862
If a car can run on solar, a laptop can: https://www.youtube.com/watch?v=TYroDMTLVt4 Not all charging needs to happen simultaneously with usage- that matters especially for cars, and is less of an issue with laptops and phones. With tens of cell phone companies, it is curious why sunlight-readable screens like Pixel Qi could not have been saved by economic recovery acts (CHIPS, TARP 1 & 2) when Intel is being bailed out to produce chips domestically (it makes for good campaigning, but does little for the end user).
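The "charging need not be simultaneous" point is just an energy-ledger argument: a battery time-shifts a weak all-day trickle into a few hours of full-rate use. A minimal sketch, with assumed numbers rather than measurements of any specific product:

```python
# Charging and usage need not be simultaneous: a battery time-shifts
# solar energy. All numbers are illustrative assumptions, not measurements.

panel_w       = 3.0   # small lid-sized panel in decent light
sun_hours     = 5.0   # equivalent full-sun hours per day
harvest_wh    = panel_w * sun_hours        # 15 Wh gathered over the day

laptop_draw_w = 4.0   # frugal laptop: low-power SoC, reflective display
use_hours     = 3.0
consumed_wh   = laptop_draw_w * use_hours  # 12 Wh used in a short session

print(f"harvested: {harvest_wh:.0f} Wh, consumed: {consumed_wh:.0f} Wh")
print("day ends net-positive" if harvest_wh >= consumed_wh
      else "needs a grid top-up")
```

Under those assumptions the ledger closes with room to spare, which is the whole trick: the panel never has to match the laptop's instantaneous draw, only its daily appetite.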
(At least with a little organization and public support, there should be an open source, public option for solar powered tech- like the public option in health insurance.) Plus, I'm also weird. I also believe it's hip to be square. In the 13 years since the Pixel Qi, it remains one of the few manufacturers to ever release a useful screen. The U.S. takes pride in options, but has a weak spot for Apple, Pixel, and Samsung phones, and those phone makers have never offered a transflective display option. Anjan Katta (with whom I have personally interacted) has a much more holistic vision of computing, one that does not treat the display as a second-class component: https://daylightcomputer.com/product His glare-free display is the first major innovation in displays in over 10 years. Why it took a startup to do that, when Intel, Microsoft, Dell, HP, and Apple could not, is highly questionable. Kudos to Anjan.
There are a lot of useful things AI can do, such as for robotics, information sorting & retrieval, and programming, but 10% of most AI VC funding could be diverted to other things and nothing of value would be lost.
There are already programs for that. Congress is spending over $280 billion on the CHIPS Act, and it's unclear how much of that is going nowhere.
"Intel’s investments are expected to create more than 10,000 company jobs and nearly 20,000 construction jobs, and to support more than 50,000 indirect jobs with suppliers and supporting industries."
However, from https://www.theverge.com/2024/8/1/24210656/intel-is-laying-off-over-10000-employees-and-will-cut-10-billion-in-costs :
"The chipmaker just announced it’s downsizing its workforce by over 15 percent as part of a new $10 billion cost savings plan for 2025, which will mean a headcount reduction of greater than 15,000 roles, Intel tells The Verge. The company currently employs over 125,000 workers, so layoffs could be as many as 19,000 people.
Intel will reduce its R&D and marketing spend by billions each year through 2026; it will reduce capital expenditures by more than 20 percent this year; it will restructure to “stop non-essential work,” and it’ll review “all active projects and equipment” to make sure it’s not spending too much."
10,000 jobs created, followed by 15,000 jobs lost = a net of -5,000 jobs. Plus, R&D is cut. So there goes the Intel Claremont. Have a datacenter? Intel has no problem selling you its Dynamic Voltage Frequency Scaling technology in orders above 10,000. Want a single-core x86 processor that can run on 5mW? Intel is not interested in your pocket change.
A PBS video on futurism, aired April 2024: https://www.pbs.org/video/beyond-the-now-ft4j2u/
Ask yourself first: how many products do you use and implicitly accept that were created from some previous inventor's idea of the future? Or: why are you still reading this?
"Ben Franklin Demonstrates the identity of Lightning and Electricity, from which he invented the lightning Rod.
https://youtu.be/cXlIZBZpkoA?t=1177 "Firing Line with William F. Buckley Jr.: Is the World Funny?" Hoover Institution Library & Archives. Episode 064, recorded on July 7, 1967. Guest: Groucho Marx. (Two years before PBS- Groucho advocating for better programming.)
On May 1, 1969, Fred Rogers appeared before the U.S. Senate Commerce Committee requesting funds to help support the growth of a new concept -- national public television.
"On May 1, 1969, Fred Rogers, host of the (then) recently nationally syndicated children's television series, Mister Rogers' Neighborhood (named Misterogers' Neighborhood at the time), testified before the Senate Committee on Commerce Subcommittee on Communications to defend $20 million in federal funding proposed for the newly formed non-profit Corporation for Public Broadcasting, which was at risk of being reduced to $10 million. Subcommittee chairman, Senator John Pastore (D-RI), unfamiliar with Fred Rogers, is initially abrasive toward him. Over the course of Rogers' 6 minutes of testimony, Pastore's demeanor gradually transitions to one of awe and admiration as Rogers speaks."
https://www.youtube.com/watch?v=fKy7ljRr0AA
1995 https://www.computerhistory.org/timeline/software-languages/#a2eed4bb9308a22315003de9062221ec
2006 https://www.computerhistory.org/timeline/computers/#169ebbe2ad45559efbc6eb357204d969
https://www.computerhistory.org/timeline/2012/
"Raspberry Pi computer
Raspberry Pi, a credit-card-size single board computer, is released as a tool to promote science education. Conceived in the UK by the Raspberry Pi Foundation, this credit-card-sized computer features ease of use and simplicity, making it highly popular with students and hobbyists. In October 2013, the one millionth Raspberry Pi was shipped. Only one month later, another one million Raspberry Pis were delivered. The Pi weighed only 45 grams and initially sold for only $25-$35 U.S. Dollars."
https://en.wikipedia.org/wiki/Intel_Quark
Quark powers the (now discontinued) Intel Galileo developer microcontroller board.[3] In 2016 Arduino released the Arduino 101 board that includes an Intel Quark SoC.[4][5] The CPU instruction set is, for most models, the same as a Pentium (P54C/i586) CPU.[6]
Intel announced the end-of-life of its Quark products in January 2019, with orders accepted until July 2019 and final shipments set for July 2022.[1][11]
November 30, 2022: The PCR-like amplification of the enshittification of the internet
ChatGPT (Chat Generative Pre-trained Transformer) is a chatbot developed by OpenAI and launched on November 30, 2022. Based on a large language model, it enables users to refine and steer a conversation towards a desired length, format, style, level of detail, and language. Successive prompts and replies, known as prompt engineering, are considered at each conversation stage as a context.[2]
By January 2023, it had become what was then the fastest-growing consumer software application in history, gaining over 100 million users and contributing to the growth of OpenAI's current valuation of $80 billion.[3][4]
Enshittification is a term coined by writer Cory Doctorow in November 2022 to describe a pattern of decreasing quality observed in online services and products such as Amazon, Facebook, Google Search, Twitter, Bandcamp, Reddit, Uber and Unity. The American Dialect Society selected the term as its 2023 Word of the Year. Doctorow has also used the term platform decay to describe the same concept.
History and definition: The term enshittification was coined by Doctorow in a November 2022 blog post[1] that was republished in Locus in January 2023.[2] He expanded on the concept in another blog post,[3] which was republished in the January 2023 edition of Wired:[4]
In a 2024 article on ft.com, Doctorow extended the word with the term "enshittocene" to state that "'enshittification' is coming for absolutely everything".[5]
The beginning of an enshittification era necessitates a parallel internet infrastructure endeavor. Enter Minitel 2W: a local-first internet where text and encyclopedias are prioritized over advertisements, algorithms, and antagonistic partisanship. One where feces is absent. Publicly funded internet in the public interest.
https://www.thefp.com/p/npr-editor-how-npr-lost-americas-trust NPR applied for and accepted tax-payer dollars. "I’ve Been at NPR for 25 Years. Here’s How We Lost America’s Trust. Uri Berliner, a veteran at the public radio institution, says the network lost its way when it started telling listeners how to think. By Uri Berliner April 9, 2024"
For a long time, accepting tax-payer dollars was completely acceptable, because the network attempted to be non-partisan.
Why not an internet advocacy group similar to NLNet in the U.S.?
AUTHENTICITY CERTIFIED: Text version below transcribed directly from audio
Senator Pastore: Alright Rogers, you've got the floor.
Mr. Rogers: Senator Pastore, this is a philosophical statement and would take about ten minutes to read, so I'll not do that. One of the first things that a child learns in a healthy family is trust, and I trust what you have said that you will read this. It's very important to me. I care deeply about children.
Senator Pastore: Will it make you happy if you read it?
Mr. Rogers: I'd just like to talk about it, if it's alright. My first children's program was on WQED fifteen years ago, and its budget was $30. Now, with the help of the Sears-Roebuck Foundation and National Educational Television, as well as all of the affiliated stations -- each station pays to show our program. It's a unique kind of funding in educational television. With this help, now our program has a budget of $6000. It may sound like quite a difference, but $6000 pays for less than two minutes of cartoons.
Two minutes of animated, what I sometimes say, bombardment. I'm very much concerned, as I know you are, about what's being delivered to our children in this country. And I've worked in the field of child development for six years now, trying to understand the inner needs of children. We deal with such things as -- as the inner drama of childhood. We don't have to bop somebody over the head to...make drama on the screen. We deal with such things as getting a haircut, or the feelings about brothers and sisters, and the kind of anger that arises in simple family situations. And we speak to it constructively.
Senator Pastore: How long of a program is it?
Mr. Rogers: It's a half hour every day. Most channels schedule it in the noontime as well as in the evening. WETA here has scheduled it in the late afternoon.
Senator Pastore: Could we get a copy of this so that we can see it? Maybe not today, but I'd like to see the program.
Mr. Rogers: I'd like very much for you to see it.
Senator Pastore: I'd like to see the program itself, or any one of them.
Mr. Rogers: We made a hundred programs for EEN, the Eastern Educational Network, and then when the money ran out, people in Boston and Pittsburgh and Chicago all came to the fore and said we've got to have more of this neighborhood expression of care. And this is what -- This is what I give. I give an expression of care every day to each child, to help him realize that he is unique. I end the program by saying, "You've made this day a special day, by just your being you. There's no person in the whole world like you, and I like you, just the way you are." And I feel that if we in public television can only make it clear that feelings are mentionable and manageable, we will have done a great service for mental health. I think that it's much more dramatic that two men could be working out their feelings of anger -- much more dramatic than showing something of gunfire. I'm constantly concerned about what our children are seeing, and for 15 years I have tried in this country and Canada, to present what I feel is a meaningful expression of care.
Senator Pastore: Do you narrate it?
Mr. Rogers: I'm the host, yes. And I do all the puppets and I write all the music, and I write all the scripts --
Senator Pastore: Well, I'm supposed to be a pretty tough guy, and this is the first time I've had goose bumps for the last two days.
Mr. Rogers: Well, I'm grateful, not only for your goose bumps, but for your interest in -- in our kind of communication. Could I tell you the words of one of the songs, which I feel is very important?
Senator Pastore: Yes.
Mr. Rogers: This has to do with that good feeling of control which I feel that children need to know is there. And it starts out, "What do you do with the mad that you feel?" And that first line came straight from a child. I work with children doing puppets in -- in very personal communication with small groups: What do you do with the mad that you feel? When you feel so mad you could bite. When the whole wide world seems oh so wrong, and nothing you do seems very right. What do you do? Do you punch a bag? Do you pound some clay or some dough? Do you round up friends for a game of tag or see how fast you go? It's great to be able to stop when you've planned a thing that's wrong. And be able to do something else instead, and think this song -- 'I can stop when I want to. Can stop when I wish. Can stop, stop, stop anytime....And what a good feeling to feel like this! And know that the feeling is really mine. Know that there's something deep inside that helps us become what we can. For a girl can be someday a lady, and a boy can be someday a man.'
Senator Pastore: I think it's wonderful. I think it's wonderful. Looks like you just earned the 20 million dollars."
If it appears my pet project is vastly incomparable to a public national TV program that needs funding, I beg to differ. Today, public tax dollars are spent on vanity projects that serve special-interest groups without ensuring that the money is not wasted on largely redundant overhead (popups, bloated websites, horrible user interfaces).
https://userinyerface.com/ (A perfect exercise in discombobulation)
A public, no-frills, no-nonsense internet pipework should have been built yesterday. As George Carlin once said, "As you swim the river of life, do the breast stroke. It helps to clear the turds from your path."
https://www.theintrinsicperspective.com/p/how-the-new-york-times-beat-me
https://www.youtube.com/watch?v=JPrAuF2f_oI Tom Lehrer - Pollution September, 1967
https://knowyourmeme.com/memes/but-can-it-run-crysis
2007: Crysis released- "But can it run Crysis?" 2010: Quake II demo in the browser- "But can it run Linux?" 2011: "But can it run on solar?" The Intel Claremont runs Windows 95. 2024: Can it run on solar, and is it commercially available? No. 2034: Can it run on solar? Who knows? 2044: Can it run Crysis on solar? Who knows- did they even get solar Linux running yet?
"About
But Can It Run Crysis? is a phrase referring to the 2007 Crytek first-person shooter Crysis, underscoring the reputation the game obtained for its steep system requirements at the time of its release. The phrase has slowly evolved into a snowclone as years pass, substituting "X" in "But can it run X?" with the most recent and system-demanding title of the day."
https://knowyourmeme.com/memes/snowclone
"About
Snowclones are a type of phrasal templates[2] in which certain words may be replaced with another to produce new variations with altered meanings, similar to the "fill-in-the-blank" game of Mad Libs. Although freeform parody of quotes from popular films, music and TV shows is a fairly common theme in Internet humor, snowclones usually adhere to a particular format or arrangement order which may be reduced down to a grammatical formula with one or more custom variables. They can be understood as the verbal or text-based form of photoshopped exploitables."
"Solar Powered Computer" is a snowclone of "But can it run Crysis?" because it expands the template from game/program to operating system, as was done in 2010.
Mad Libs:
https://en.wikipedia.org/wiki/Mad_Libs
Mad Libs is a phrasal template word game created by Leonard Stern[1][2] and Roger Price.[3] It consists of one player prompting others for a list of words to substitute for blanks in a story before reading aloud. The game is frequently played as a party game or as a pastime.
The game was invented in the United States, and more than 110 million copies of Mad Libs books have been sold since the series was first published in 1958.[3]
Mad Libs was invented in 1953[4] by Leonard Stern and Roger Price. Stern and Price created the game, but could not agree on a name for their invention.[3] No name was chosen until five years later (1958), when Stern and Price were eating Eggs Benedict at a restaurant in New York City. While eating, the two overheard an argument at a neighboring table between a talent agent and an actor.[3] According to Price and Stern, during the overheard argument, the actor said that he wanted to "ad-lib" an upcoming interview. The agent, who clearly disagreed with the actor's suggestion, retorted that ad-libbing an interview would be "mad".[3] Stern and Price used that eavesdropped conversation to create, at length, the name "Mad Libs".[3] In 1958, the duo released the first book of Mad Libs, which resembled the earlier games[5] of consequences and exquisite corpse.
Stern was head writer and comedy director for The Steve Allen Show, and suggested to the show's host that guests be introduced using Mad Libs completed by the audience. Four days after an episode introduced "our guest NOUN, Bob Hope", bookstores sold out of Mad Libs books.[6]
https://en.wikipedia.org/wiki/Ad_libitum "In music and other performing arts, the phrase ad libitum (/ædˈlɪbɪtəm/; from Latin for 'at one's pleasure' or 'as you desire'), often shortened to "ad lib" (as an adjective or adverb) or "ad-lib" (as a verb or noun), refers to various forms of improvisation.
The roughly synonymous phrase a bene placito ('in accordance with [one's] good pleasure') is less common but, in its Italian form a piacere, has entered the musical lingua franca (see below).
The phrase "at liberty" is often associated mnemonically (because of the alliteration of the lib- syllable), although it is not the translation (there is no cognation between libitum and liber). Libido is the etymologically closer cognate known in English.
In biology and nutrition, the phrase is used to describe feeding without restriction.[1]"
"Other performing arts "Ad-lib" is used to describe individual moments during live theatre when an actor speaks through their character using words not found in the play's text. When the entire performance is predicated on spontaneous creation, the process is called improvisational theatre.
In film, the term ad-lib usually refers to the interpolation of unscripted material in an otherwise scripted performance. For example, in interviews, Dustin Hoffman says he ad-libbed the now famous line, "I'm walking here! I'm walking here!" as "Ratso" Rizzo in Midnight Cowboy (1969). While filming at a streetcorner, the scene was interrupted by a taxi driver. Hoffman wanted to say, "We're filming a movie here!", but stayed in character, allowing the take to be used.[2]"
On a tangent, the phrase was also in Forrest Gump: https://www.reddit.com/r/MovieDetails/comments/and66e/in_forrest_gump_1994_when_lt_dan_and_forrest/
3/11/2024- Update to my previous "origin" story. In June 2011, Liliputing posted an article titled "Pixel Qi suggests low power tablets could be powered by 1W solar panels," where I commented on the original article. The link I posted was to a 6/2011 TechCrunch article documenting industrial designer Andrea Ponti's "Luce Solar Panel Powered PC." Luce is Italian for "light." I don't claim to have been the first to have the idea for a solar powered laptop. I just want(ed) and (still) want the idea to come to fruition.
I first conceived of this project after visiting and reviewing the Maker Faire in 2011, seeing both the Raspberry Pi and a booth with a kit for solar-powerable electronics and solar power managers (similar to Adafruit kits now sold) called BootStrap Solar.
I wrote about it weeks later.
A year later I posted about it again in 2012.
I resumed interest in 2020 on the Raspberry Pi Forums after seeing the Solar Gameboy in the news:
This project then migrated across a few informal forum posts (EI2030- now defunct- and the Raspberry Pi Forum) to Hackaday in 2021. It has remained there since, although GitHub also provides an efficient mechanism for storing and cloning projects.
[9-24-24 Note: the Hackaday.io server (not the .com site) occasionally has outages throughout the year- the last was around 2-3 months ago. The project site's 500 error looks like this:
If it appears, the site might be intermittently accessible, or may return in a few days.]
On November 6th, 2022, I wrote a lengthy Substack post on the case for solar powered electronics, which I continued to edit, revise, and add source material to well into 2023. In it, I addressed foundry space and the term "pure play." I examined how, while foundries are "pay to play," access to them is also potentially and profoundly unequal:
From a recent PC Gamer article (5/23/2024), "Even mega-companies such as Google and Qualcomm don't have the cash on hand to outbid big daddy Apple."
"According to 9to5mac (via Extreme tech), Apple's chief operating officer met with TSMC bosses to negotiate terms to secure 2nm capacity. The report cites 'local sources', so there needs to be a pinch of salt here, but given Apple's track record of locking up TSMC's capacity, this would certainly come as no surprise.
TSMC's 2nm process is currently scheduled to enter risk production in 2025, with volume production set for the second half of 2025. That means the upcoming iPhone 16 family with A18 chips will stick with 3nm, but it could mean the high end iPhone 17 Pro and Max could be the first in line to be built with 2nm technology.
The specifics of deals between TSMC and its customers are obviously not made public, but it is widely accepted that Apple booked all of TSMC's 3nm capacity, at least for a period of time. These chips, including the A17 and M3 families made their way into the latest generation iPhone 15 Pros and MacBooks. That would make Apple's 2nm play unsurprising."
My predictions ran more than a year in advance. Would anyone want one company to reserve all the tooling at a company like TSMC? Hence the term "foundry neutrality," similar to net neutrality. Just as the FCC steps in to ensure a minimum of wireless spectrum and internet bandwidth is reserved for basic internet capability, some quantity of wafers should be reserved for startups, to prevent a too-big-to-fail FAANG company from reserving 100% of leading-edge semiconductor capacity, lest it be considered a monopoly. (Not so coincidentally, Apple is being sued by the DOJ for monopolizing the app market: https://www.justice.gov/opa/pr/justice-department-sues-apple-monopolizing-smartphone-markets It should not be hard to make the connection that Apple would want the same type of dominance in hardware.) The tech media could cover that instead of ChatGPT for once, even if it might sound like they are catching up to regulation news after the CHIPS Act led to new foundry investments in Arizona.
"it is widely accepted that Apple booked all of TSMC's 3nm capacity, at least for a period of time" - the definition of temporary reservations - "at least for a period of time," can be interpreted widely, because temporary booking of all a node's wafers in a year of production is different from a permanent booking. For example, while it is not technically a monopology on all leading edge foundries (which can include 3nm at Samsung), it still can delay progress in innovation when certain technologies require it. Are chipmakers utilizing 3nm for solar powered phones, and if so, what kind of memory are being used? It is likely that the nodes at 22nm and 12nm are still maturing for logic, SRAM and DRAM, thus there is some improvement that can be make in low-leakage processes such as TSMC-ULL/ULP and Global Foundries FD-SOI.
https://www.barrons.com/articles/apple-computer-chips-taiwan-trade-geopolitics-1605f116
"Your next iPhone won’t be stamped Made in America. But pry open the casing in 2025 and you may see semiconductor chips that were etched into silicon in the Arizona desert." If you're wondering why Apple may be doing business in Arizona soon, it's not because they are as American as Apple Pie. It's because they're being dragged, kicking and screaming, to comply with U.S. regulations (and a tax concession). For a company as callous to produce a hydraulic press ad, it should be a reflection of the steely heart that the company now is.
Though technically the concept of a solar powered computer existed before that, as the solar calculator- which also influenced my interest- I can't think of any time before 2011 that I wanted to solar power a device like a small PC, although I had been aware of the OLPC since the mid-00s.
In "A Conversation with Mary Lou Jepsen What’s behind that funky green machine?" (11/2007, ACM QUEUE), then CTO Jepsen describes the power consumption:
"It’s pretty hot in much of the developing world, so we’ve designed a laptop that can take extreme heat. Part of that is an artifact of it being so low powered. We don’t need big electrolytic capacitors whose lifetimes halve every 10 degrees hotter you get. We get to use little tiny capacitors because we’ve got so little power to deal with, and that’s quite helpful.
Also, half the kids in the world don’t have electricity at home. Half the kids. Eighty percent of the schools that we’re going into don’t have electricity. So we had to design a laptop that was also the infrastructure. It has mesh networking, which is the last mile, 10 miles, 100-mile Internet solution. The solar repeaters and active antennas that we’ve added into the mix cost about $10 a piece and help to relay the Internet. If one laptop in a village is connected to the Internet, they all are.
Yes, it might be just a trickle, a low-bandwidth connection from the Internet to the laptop, but between the laptops is a high-bandwidth connection through the mesh network. We use 802.11s, which is the standard for mesh. The standard isn’t actually complete, but we will be compatible with it when it’s completed. We’ve had to make it up as we go along, so we’re a little ahead of that. There’s truly so little power in the developing world. If a school is wired, it tends to be on a generator, and there’s one 60-watt light bulb per classroom. Generators make really weird power. Usually what comes out of the wall in most countries is 50 or 60 hertz, or somewhere in between. With generators, the frequency of the AC power can go down to 35 hertz. We therefore had to do really interesting power conditioning on the AC adapter. The laptop itself can take between negative 32 volts to 40 volts, and work well with anything from 11 to 18 volts. You can plug a car battery into it. You can plug a solar panel into it."
In early 2023, I recall listening to an audio interview with Richard Barbrook: https://www.youtube.com/watch?v=KyoxwUmQBns. Having been quite wired into the maker community over the past 15 years, I had not really paid much attention to the political differences between Silicon Valley's origins and European constitutions:
"Unlike its American equivalent, the French revolution went beyond economic liberalism to popular democracy. Following the victory of the Jacobins over their liberal opponents in 1792, the democratic republic in France became the embodiment of the ‘General Will’. As such, the state was believed to defend the interests of all citizens, rather than just to protect the rights of individual property-owners. The discourse of French politics allows for collective action by the state to mitigate – or even remove – problems encountered by society. While the Californian Ideologues try to ignore the taxpayers’ dollars subsidising the development of hypermedia, the French government can openly intervene in this sector of the economy.46
Although its technology is now increasingly dated, the history of Minitel clearly refutes the anti-statist prejudices of the Californian Ideologues – and of the Bangemann committee. The digital future will be a hybrid of state intervention, capitalist entrepreneurship and DIY culture. Crucially, if the state can foster the development of hypermedia, conscious action could also be taken to prevent the emergence of the social apartheid between the ‘information rich’ and the ‘information poor’. By not leaving everything up to the vagaries of market forces, the EU and its member states could ensure that every citizen has the opportunity to be connected to a broadband fibre-optic network at the lowest possible price." https://networkcultures.org/wp-content/uploads/2015/10/0585-INC_NN10-totaal-RGB.pdf
It is within this context that I see an opportunity for solar powered mobile devices to become the 21st century upgrade from Minitel- a wireless, local-first, decentralized and distributed means to connect the information-poor global economy.
I'm not the best person to manage a project. There are far more qualified designers, with knowledge of supercomputers from 60 years ago that still hasn't been fully exploited: https://www.crowdsupply.com/libre-risc-v/m-class/updates/modernising-1960s-computer-technology-learning-from-the-cdc-6600 (RISC-V designers are now adopting this old Cray tech, now that its utility is being realized).
That said, I have an eye for user experience. Woz was a brilliant engineer, but Jobs understood UI far better. What I want to fuse is the minimalist appearance of a laptop like the Luce with the low power consumption of something like an Ambiq Apollo510. The issue with microcontrollers is that a Cortex-M55 is not designed as an application processor, and there is a scarcity of development in this field- at least public development.
Could anyone in the 1940s anticipate rock and roll in the '50s, counter-culture in the '60s, disco in the '70s, electronica/synthwave in the '80s, grunge in the '90s, indie rock/metal in the '00s, and superpop in the '10s? Market research can't predict what the next generation will do, but there are hints that counter-culture skips a generation. That may also explain the rise of the flip-phone generation, despite the plethora of octa-core options being given to them by the mega-conglomerates:
https://www.cnn.com/2023/01/15/business/flip-phone-gen-z-ctrp/index.html
https://mashable.com/article/gen-z-flip-phones-trend
Despite my submitting several grant applications, this project exceeds the risk tolerance of the lower-level agents:
If there is skepticism of this project, I am willing to sign a non-profit clause stating that I do not wish to be funded, if someone else is willing to steer this project to completion. How else can I prove my sincerity?
Another issue is the "agent-principal problem": https://www.strangeloopcanon.com/p/the-agent-principal-problem
Amount of risk of this project (perceived or real):
The risks may be perceived or real. For example, do you believe solar powered computers are possible? If you believe yes, you are correct: solar powered calculators have existed since 1976, and solar powered computers have been demonstrated since 2011. If you believe no, then you may not have been around solar powered calculators or computers in your youth. If smartphones are the only thing one understands as "computers," then one's view of history is quite limited. I was surprised, as you may be, to learn that many people who were interested in radios before computers were a lot like the hobbyists of today. https://en.wikipedia.org/wiki/Institute_of_Radio_Engineers
https://spectrum.ieee.org/ham-radio What has changed, however, is that dependence on a single smartphone for all of one's interaction has caused a clear rejection of the platform by some in Gen Z, due to its near monopoly on attention. Radios and early desktop computers, at least, were offline and did not try to wedge themselves into every aspect of life.
An important aspect of science literacy is understanding how things work. A solar calculator let a student see that a small amount of light could, in fact, power a calculator built on static CMOS logic like the RCA 1802. Students now issued Chromebooks in place of calculators are not brought up in an era where electricity is a privilege rather than an expectation.
https://en.wikipedia.org/wiki/RCA_1802#Programming_languages
"But that doesn’t mean children of the ‘80s, ‘90s, and today are all rushing to make their houses as green as their calculators. Solar calculators may have convinced a generation that the sun could power gadgets, but it’s not the same as convincing them that the power in tiny calculators could stream from an outlet.
The problem, says professor and futurist Cindy Frewen, is that people don’t necessarily think of rooftop panels the same way they might think of consumer electronics. "People adopt their gadgets, but they accept their energy,” she says. “They take it for granted: ‘This is what I have in my house.'"
from: https://www.nationalgeographic.com/science/article/160225-solar-calculator-history-energy-objects
This repository is not an ideology, but a pursuit of ideas.
A great Substack post today mentioned Kranzberg's six laws of technology:
"Technology is neither good nor bad; nor is it neutral.
Invention is the mother of necessity.
Technology comes in packages, big and small.
Although technology might be a prime element in many public issues, nontechnical factors take precedence in technology-policy decisions.
All history is relevant, but the history of technology is the most relevant.
Technology is a very human activity – and so is the history of technology."
https://en.wikipedia.org/wiki/Melvin_Kranzberg#Kranzberg's_laws_of_technology
http://pantaneto.co.uk/the-decline-of-unfettered-research-andrew-odlyzko/
https://en.wikipedia.org/wiki/The_Logic_of_Collective_Action
https://en.wikipedia.org/wiki/Collective_action_problem
"A collective action problem or social dilemma is a situation in which all individuals would be better off cooperating but fail to do so because of conflicting interests between individuals that discourage joint action.[1][2][3] The collective action problem has been addressed in political philosophy for centuries, but was most clearly established in 1965 in Mancur Olson's The Logic of Collective Action.
Problems arise when too many group members choose to pursue individual profit and immediate satisfaction rather than behave in the group's best long-term interests."
"David Hume provided another early and better-known interpretation of what is now called the collective action problem in his 1738 book A Treatise of Human Nature. Hume characterizes a collective action problem through his depiction of neighbors agreeing to drain a meadow"
https://stevejobsarchive.com/exhibits/objects-of-our-life (1983 International Design Conference in Aspen, Colorado)
https://stevejobsarchive.com/book/download (EPUB, 32MB)
"Of all the great companies of recent memory, there is only one that seemed to have no character, but only an attitude, a style, a collection of mannerisms. It constructed a brilliant simulacrum of character, in the way a man without empathy or conscience can pretend to have those traits. But it was never really there--even though two generations of employees convinced themselves otherwise. It was only when that character was finally tested did the essential hollowness of the enterprise finally stand exposed, and the employees and customers shrieked with betrayal.
This was Apple Computer Inc., and there has never been a company like it. It was founded by two young men, one a genius with no allegiance to any institution but his own mind; the other a protean, inconstant figure who seemed composed of nothing but charm and a pure will to power. The company they built seemed to have everything: great technology, superb products, talented employees, rabidly loyal customers, an arresting vision, even a lock on the zeitgeist. But, like its founders, it lacked character. And because of that, from the first minute of the first meeting of Steve Jobs and Steve Wozniak, a decade before the company's founding, Apple Computer was set on a path from which it could not escape, even after those founders were gone. And that path would in time lead to the company's destruction.
More than any other great company, the seeds of Apple's future glory and its subsequent humiliation were planted long before the company ever began. And bend and prune as it might, Apple Computer could never free itself of its roots.
2.0 SEED
It was Regis McKenna, the Silicon Valley marketing guru, who first saw the horrible truth: "The mistake everyone makes is assuming that Apple is a real company. But it is not. It never has been." He was too much of a businessman (after all, Apple could prove to be a once and future client) to draw the final inference: "And it never will be."
Nobody alive knew Apple better than Regis McKenna--at least nobody who had been affected by the notorious "reality distortion field" that emanated from Steve Jobs. After fifteen years of handling the company, Regis had remained unwarped and unconverted because no matter how successful Regis McKenna Inc. had become, and no matter how far it had left the publicity game behind for the more rarefied climes of marketing and business development, Regis still remained a PR man at heart. He still upheld the flack's first law: never, ever believe the hot air you put out about your client.
Not that it was easy. When you watched your client land on the cover of Time magazine and knew you got the credit for getting him there; when you stood in the convention centers and giant auditoriums and felt the waves of adoration rolling around you; and when the calls came late at night and you heard Jobs the Seducer telling you how much he depended upon you, it would have been so simple to surrender to the undertow, to lose yourself in the Apple Will.
But every time Regis thought of doing so there would be a meeting to remind him that Apple was a kind of collective madness. He would bring in an expert on marketing, or branding, or organizational theory--anything that might give the company some order, some strategic planning, some simulation of real business discipline--and he would watch in dismay as that person was humiliated, ignored or driven away. As for his own advice--well, nobody blew off the mighty Regis McKenna. Instead, they'd listen intently, nodding, foreheads pinched in concentration, even the seraphic Jobs himself making those unreadable and delicate motions with his fingers on the tabletop as if he was taking seriously what Regis was saying ... and then the Apple Corps would leave the room and never think about Regis's message again.
In the end, after fifteen years advising the company he'd helped to create, McKenna walked away from Apple Computer. And not just from Apple, but from PR itself."
-Excerpt from the 1999 book "Infinite Loop: How the World's Most Insanely Great Computer Company Went Insane" by Michael S. Malone
https://multicores.org/lisa/VCFB2023.pdf (a great history of the early hardware of the Lisa and early Apples)
https://youtube.com/watch?v=0OlEHIAIWn4 "No experience in life is wasted as an artist if remembered and used later"
By that logic, Steve Jobs was an artist- calligraphy in the Macintosh.
"Reed College at that time offered perhaps the best calligraphy instruction in the country. Throughout the campus every poster, every label on every drawer, was beautifully hand calligraphed. Because I had dropped out and didn’t have to take the normal classes, I decided to take a calligraphy class to learn how to do this. I learned about serif and sans serif typefaces, about varying the amount of space between different letter combinations, about what makes great typography great. It was beautiful, historical, artistically subtle in a way that science can’t capture, and I found it fascinating.
None of this had even a hope of any practical application in my life. But 10 years later, when we were designing the first Macintosh computer, it all came back to me. And we designed it all into the Mac. It was the first computer with beautiful typography. If I had never dropped in on that single course in college, the Mac would have never had multiple typefaces or proportionally spaced fonts. And since Windows just copied the Mac, it’s likely that no personal computer would have them. If I had never dropped out, I would have never dropped in on this calligraphy class, and personal computers might not have the wonderful typography that they do. Of course it was impossible to connect the dots looking forward when I was in college. But it was very, very clear looking backward 10 years later."
https://www.youtube.com/watch?v=b9_Vh9h3Ohw Springboard: the secret history of the first real smartphone (Full Documentary)
https://litverse.substack.com/p/steve-jobs-vs-the-haters
https://www.directors-institute.com/post/whystevejobswasfiredbyapple
https://newsteve.substack.com/p/most-ideas-come-from-previous-ideas My recent writeup with a tangent into Biology
"Looking back, some have questioned whether Apple's board of directors could have done more to retain Steve Jobs in 1985. Sculley himself later acknowledged Jobs' effective leadership and called him "the best CEO ever," admitting that he underestimated Jobs' visionary potential at the time. The firing of Jobs also raises questions about the board's decision-making and overall company strategy. Could they have chosen different approaches or members to better handle the situation?"
Imagine an open source project that uses a unified design until each milestone is complete. Jobs's logic here could be like the Cathedral in the Bazaar: the Bazaar houses the Cathedral, but temporarily cannot access it.
"Certainly all historical experience confirms the truth - that man would not have attained the possible unless time and again he had reached out for the impossible." -Max Weber
"No worthwhile human achievement has ever been instigated on the basis of demonstrable cost effectiveness." - Adrian Bowyer
My GitHub draft got lost because I accidentally cancelled a tab closing and undid a nearly complete commit. I had written a substantial paragraph on how cost is a secondary consideration in the progression of inventions, citing Bowyer, but also analyzing the "middle stage" adaptation by Xerox's Chuck Thacker and Apple's Woz of IBM/PDP systems (e.g. 945/360, PDP-6) into novelties: portable, Compaq-like consumer tech. They weren't the only ones; Landley.net has a better history of that, mentioning Paul Allen and Mitch Kapor, whom I read much less into, but who had identified that complexity was no longer the only drafting stage:
"The bestselling computer in the world was the PDP-8 from 1973 until it was displaced by the Apple II around 1979, and in its entire production run the PDP-8 sold a grand total of around fifty thousand units EVER, meaning there was no consumer base for a "software industry" before microcomputers. Most software before that was either produced by hardware manufacturers bundling software with the hardware they sold, or by local staff maintaining an installation, or collaborations like produced Multics. What little commercial software got created was bespoke development tailored to specific installations because there was no other business model yet due to a lack of customers to sell to. (Not a lot of speculative development when your total potential worldwide market for PDP-6 software was 23 machines, you talk to them FIRST and get paid before putting in the engineering time, and then you DO the work on their hardware because you haven't got one.) The first computer to sell a million units was the Commodore VIC 20 at the end of 1982, and "the computer" was Time's man of the year for 1982. The Apple vs Franklin legal battle happened when it did because a shrinkwrap software finally had a potential customer base THAT YEAR. People fought over the money once there actually was money."
To focus on two engineers:
" The obvious difficulty in this method--the only method the engineering world had known until recently--was that the more complex the problem, the more complicated the hardware setup needed to address it. In this world, the most gifted engineers were those who could puzzle out novel ways to reduce the number of components by, say, 10 percent. And it was in this particular type of simplification that Woz had shown almost supernatural talent." from Infinite Loop (1999) by Michael Malone https://archive.nytimes.com/www.nytimes.com/books/first/m/malone-loop.html
from The Dealers of Lightning: Xerox PARC and the Dawn of the Computer Age (1999) by Michael A. Hiltzik
A careful review shows that they both understood that cost reduction was the future of Silicon Valley. And since PARC was primarily focused on 10-year research and development, the marketability of the Xerox Star would arrive too late; Jobs accelerated development by completing their menu screen and cost reduction, as he stated in a 1990 interview with WGBH Boston: https://www.youtube.com/watch?v=L40B08nWoMk
To tie this back to the hypothesis stage: the phrase "talk is cheap" isn't meant to devalue talk, but to acknowledge that cost is an important factor even in the exploratory stage. "Talk is cheap" is thus another way of rapidly prototyping (hence Bowyer's RepRap concept) various iterations of expensive ideas without actually making them all: exchanging them via market surveys/research, forums, or prior product feedback to discover market or business inefficiencies, both in the availability of products and in underserved markets, which sometimes turn out to have viral, general-purpose applications after passing through the Gartner hype cycle. In other words, "Think Rich" is not delusions of grandeur; it is an inalienable right to discuss concepts and products, since ideas cannot be patented, and creative people do not often have a shortage of them (and are sometimes better off working on multiple projects if they have writer's block/engineer's block).
On the cost of prototypes, toys, and research, Byrne Hobart writes: "But there's a feature of the outside world that also has a major impact: solutions-in-search-of-a-problem and toys-in-search-of-real-world-use are both less costly and more valuable in a world of lower real interest rates." ("'A Solution in Search of a Problem' is a Low-Rates Phenomenon," 12/12/2022)
I had written something in my draft about I/O shields, and how many new SBCs such as the Raspberry Pi, Orange Pi, and even AMD boards could share more similar form factors, supporting even long-side backplates (as in AMD's case), if they developed something like a mini I/O shield (1/6 scale, perhaps).
from https://www.aaeon.com/en/p/pico-itx-boards-pico-apl3-semi (the smallest case with a standard form factor: Pico-ITX. With the Raspberry Pi, you are at the manufacturer's whim each generation if they change up the design)
Other mini-ITX cases with flexible backplate slots:
This little "cosmetic" detail could support more reusability and dual-use integration into laptops such as the Pi-Top, which would not require major modification of the chassis if the rectangular I/O shield can be limited to a narrow segment of it. As mentioned in my mobile-ITX blog post, degrees of freedom are a chemistry concept that chemists know well, and architects of SBCs have far more leeway to rearrange components so that they do not require convoluted overpasses of wires above heatsinks to reach the backplate I/O.
"The cubane name derives from the cube-shaped geometry of the molecule. Since carbon normally bonds at angles of 109.5 degrees, the forced 90-degree angles of the cube framework introduce a high degree of strain into the molecule—so much so that prior to Eaton’s seminal synthesis, most chemists and theoreticians deemed the very existence of the molecule impossible.
“Not only did Phil synthesize cubane, but he did so by a very creative strategy that used photochemistry to excite the molecule into a cage structure and a ring contracting reaction to attain the desired carbon framework,” said Rawal."
https://news.uchicago.edu/story/philip-eaton-renowned-chemist-and-founder-cubane-1936-2023
I think it's important not to understate my early influences. When I was an undergraduate majoring in Biology, my introductory organic chemistry class had a lecture on the synthesis of chemical structures. “To this day, it’s a landmark. If you look up a textbook on organic synthesis, Eaton’s cubane synthesis will be showcased,” said Prof. Viresh Rawal, Eaton’s colleague and chair of the UChicago Department of Chemistry. “It is often used to demonstrate the power of chemical synthesis and the ingenuity that such molecules inspire.”
As I reflect on my research interests, I can't help comparing the relatively boring field of chemistry to the hot field of semiconductors (it's obvious who gets all the press).
“Phil said one thing to me that I remember to this day,” said Chuan He, Eaton’s colleague and the John T. Wilson Distinguished Service Professor of Chemistry. “He said: ‘So many people work on natural products; I decided to work on unnatural products.’ I think that captures the essence of the University of Chicago. We strive to work on things that are different, unique or sometimes unpopular.”
Semiconductors are unnatural products, so what difference does it make on which types of electronics are paired together?
If you browse through all my repositories, you may notice a pattern.
All this time, I've been trying to "pack" efficient components, be it pre-built Linux distros such as DietPi into 256 MB of RAM, or hardware (currently in the concept stage), into a box. Or you could say a cube. But this cube is a circuit design, for a PCB.
The atoms C (carbon) and N (nitrogen) represent components. The traditional PCB sees power input on a 2D plane (not in the literal sense, but figuratively). A flat compound is cyclohexene (of the cycloalkenes); consider the double bond the circuit where electricity flows. This is the "traditional" PCB. Now the era of 3D stacked memory is popular. PCBs have always had layers (2, 4, 6, etc.), but power is usually viewed in a two-dimensional plane: it is taken as a given that power is an input, but rarely is any consideration given to the source of that power, or to what type of PCB is needed for it to work.
from https://github.com/TUDSSL/ENGAGE#system-design-game-boy-emulation-and-system-state-checkpointing
In other words, engineering design is capable of thinking in 3D terms when it comes to memory and CPU (e.g. Ryzen 7 5800X3D), so why not solar power integration? Energy harvesting shifts the utility of the design into what can be perceived as having to "work" to generate power, because holding a tablet or phone up to collect sunlight would appear to be a "chore" for the consumer. But that is not really a universal belief.
Some manufacturers are "subtly" including solar charging into products again:
Most technological development involves some amount of efficiency advancement for it to be marketable. What this project seeks to do is integrate all of those highly efficient technologies into one "cube." You can call it disruptive, in the same way cubane can be considered disruptive.
"The resulting high energy density means a large amount of energy can be stored in a comparably smaller amount of space, an important consideration for applications in fuel storage and energy transport."
By integrating a high-efficiency solar DC-DC charger, such as the TI BQ24074 ("Automatic charging current tracking for high efficiency use of any wattage solar panel")...
...and the most efficient microcontroller in terms of microamps per megahertz (µA/MHz), the Ambiq Apollo4, you have a very compact "box." One that can be shipped.
Step 1: Design Box
Step 2: ???
Step 3: Ship Box. Profit $$$
Carbon, meet Nitrogen (or Hydrogen, take your pick). I didn't invent this pairing. Countless others before me actually built a working prototype. I'm just explaining the trend. Now a cube requires 12 bonds (plus 8 more for H) (C8H8). To ship this "box", it needs a display. Enter memory-in-pixel. For it to be ubiquitous, it probably needs a long-range modem, such as LoRa or NB-IoT.
So far that's only 4. But enough for a diagram:
It just so "happens" that Moore's Law, and its energy-efficiency counterpart Koomey's Law, have progressed to a stage where the amount of power from a small solar panel is enough to power all four of those components. Powering a keyboard, mouse/touchscreen, and/or voice recognition are additional challenges, but the basic circuit has been described.
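To put rough numbers on that claim, here is a back-of-envelope budget written as a tiny C program. Every figure in it is an assumption for illustration (Ambiq's headline 4 µA/MHz figure, a 96 MHz clock, a 3.3 V rail, a memory-in-pixel panel holding a static image, a 0.1% LoRa transmit duty cycle), not a datasheet guarantee:

```c
/* Back-of-envelope power budget for the four-component "cube".
 * All figures are illustrative assumptions, not measurements. */
#include <stdio.h>

int main(void) {
    const double supply_v       = 3.3;   /* assumed rail */
    const double mcu_ua_per_mhz = 4.0;   /* Ambiq's headline Apollo4 figure */
    const double mcu_mhz        = 96.0;  /* assumed clock */
    const double display_mw     = 0.05;  /* memory-in-pixel LCD, static image (assumed) */
    const double lora_tx_mw     = 400.0; /* LoRa transmit burst (assumed) */
    const double lora_duty      = 0.001; /* 0.1% duty cycle, a common ISM-band limit */

    double mcu_mw  = mcu_ua_per_mhz * mcu_mhz * supply_v / 1000.0; /* uA -> mW */
    double lora_mw = lora_tx_mw * lora_duty;                       /* time-averaged */
    double total   = mcu_mw + display_mw + lora_mw;

    printf("MCU:     %.2f mW\n", mcu_mw);     /* ~1.27 mW */
    printf("Display: %.2f mW\n", display_mw);
    printf("LoRa:    %.2f mW (averaged)\n", lora_mw);
    printf("Total:   %.2f mW\n", total);      /* comes out under 2 mW */
    return 0;
}
```

Under these assumptions the whole stack averages under 2 mW, so even a credit-card-sized panel delivering on the order of 100 mW in decent light has enormous headroom.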
The bi-directional flow of electricity in a cubic circuit isn't meant to be taken literally; it would need to follow standard solar charging circuit designs (or any other energy harvesting circuit).
If you think this project impossible, then it's like saying cubane is impossible.
On a side note, the phrase "be there or be square" originated in the 1940s:
"The sense of square as a derogatory reference to someone conventional or old-fashioned dates to the jazz scene of the 1940s; the first known reference is from 1944. There it applied to someone who failed to appreciate the medium of jazz, or more broadly, someone whose tastes were out of date and out of touch."
So there you have it: cubane is to jazz as flat PCBs are to "being square".
Porting Linux or BSD to a microcontroller would require a lot of effort, yet it seems the path of least resistance compared to trying to develop a sub-30 nm application processor designed for extremely low power consumption (less than 2 mW). With the Apollo4 Plus running at 4 µA/MHz, that presents a fast enough processor for basic applications while still retaining a low power profile.
One of the first goals is to select an OS that would be versatile enough for basic, low-RAM applications. Another step would be to build a bootloader, via Buildroot or Yocto development. A third step would be to run the current application only in RAM.
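As a minimal sketch of that third step, here is how hot code can be pinned into SRAM with a GCC toolchain. The ".ramfunc" section name is my assumption; it must match a section your linker script actually places in RAM (and that the startup code copies there at boot):

```c
/* Hypothetical sketch: execute a hot function from SRAM so it never
 * touches slow external flash/microSD at run time. Requires a linker
 * script with a ".ramfunc" output section that startup code copies
 * from flash into RAM before main() runs. */
__attribute__((section(".ramfunc"), noinline))
void refresh_screen(void)
{
    /* drawing code lives, and executes, entirely in SRAM */
}
```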
"Its embedded 4.75MB of memory delivers power-efficient display performance by storing images on-chip to avoid exhausting resources by fetching data from external memory.' https://embeddedcomputing.com/technology/iot/wireless-sensor-networks/ambiq-enables-audio-radio-and-graphics-for-always-connected-iot-endpoints
One of the key differences between this project and the Raspberry Pi is that microSD-loaded OSes like Raspberry Pi OS are extremely slow, and do not compare to the true bare-metal capabilities of even the ARMv6 in the Raspberry Pi Zero. Loading Raspup Buster 8.2.1 or Tiny Core Linux into 512 MB of RAM on a Raspberry Pi 3 runs faster than loading apps from microSD.
The early Nokia phones had immediate response times when navigating the Symbian OS. Modern operating systems adopt an omnibus of kernel modules that prevents replicating the user experience of earlier phones and desktops. Phones attempt to run at ever faster speeds to keep up with hundreds of processes, yet lag behind the simplified OSes of a generation ago.
What would be ideal is to focus efforts on operating systems that feature HMI and userspace applications while retaining the benefits of RTOS task scheduling. An analogy would be the UDP vs. TCP comparison: UDP is a connectionless protocol, so it does not require connections to restart after losing a byte to a corrupt transmission packet. Rather than have hundreds of process IDs waiting to be completed, the user could wait for a kernel task to complete, and if the kernel is not busy, the syscall could complete the user's request. I call it the "use it or lose it" concept: the kernel would be programmed to skip a task that would take too long to process. It would be better to have real-time notifications of kernel tasks (like a system monitor) rather than an hourglass (or no indicator at all) to observe system health; this would significantly benefit microcontrollers that have limited RAM (2-4 MB usable).
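To make the concept concrete, here is a minimal sketch in C of a cooperative "use it or lose it" scheduler: any task whose estimated cost won't fit in the current frame is skipped and reported, instead of joining a backlog. Every name and number below is hypothetical, invented for illustration; this is not an existing RTOS API:

```c
/* "Use it or lose it": skip, rather than queue, any task whose
 * estimated cost exceeds the remaining frame budget, and report
 * the skip so the user sees system health instead of an hourglass. */
#include <stdint.h>
#include <stdio.h>

typedef struct {
    const char *name;
    uint32_t    est_us;      /* worst-case estimate, microseconds (assumed) */
    void      (*run)(void);
} task_t;

static void poll_input(void) { /* ... */ }
static void redraw(void)     { /* ... */ }
static void sync_radio(void) { /* ... */ }

static task_t tasks[] = {
    { "input", 200,    poll_input },
    { "draw",  8000,   redraw     },
    { "radio", 120000, sync_radio },
};

void run_frame(uint32_t budget_us) {
    for (unsigned i = 0; i < sizeof tasks / sizeof tasks[0]; i++) {
        if (tasks[i].est_us > budget_us) {
            printf("skipped %s (needs %luus, %luus left)\n",
                   tasks[i].name,
                   (unsigned long)tasks[i].est_us,
                   (unsigned long)budget_us);
            continue;            /* lose it: no backlog builds up */
        }
        tasks[i].run();          /* use it */
        budget_us -= tasks[i].est_us;
    }
}
```

The printf is the "system monitor" idea in miniature: the user is told what was skipped and why, rather than watching an hourglass.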
Another focus is to perhaps limit the amount of POSIX compliance in an OS to further reduce OS size.
A third focus is compatibility with plug-and-play screens that do not emit a backlight, to save power and to reduce eyestrain. A reflective display with a variable refresh rate (1-30 Hz), toggled by the user, would be a desirable feature in such a system.
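A minimal sketch of that user-toggled refresh, assuming a board support package that provides a low-power sleep_ms() and a display_refresh(); both are placeholders I have invented for illustration:

```c
#include <stdint.h>

extern void sleep_ms(uint32_t ms);      /* low-power sleep, assumed BSP call */
extern void display_refresh(void);      /* push framebuffer to the reflective panel */

static volatile uint8_t refresh_hz = 1; /* user toggles this between 1 and 30 */

void display_task(void)
{
    for (;;) {
        display_refresh();
        sleep_ms(1000u / refresh_hz);   /* lower Hz -> longer sleep -> less power */
    }
}
```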
The addition of solar panels would certainly be a goal, but would need to be implemented at a later stage.
Low-power-E-Paper-OS Working Group
=====================
Name: Low-power E-Paper OS
Objective: The goal of this project is to run an OS on an ultra low-power CPU/MCU that can output terminal or a window manager to an e-paper display.
Members: hatonthecat, @alexsotodev; open to new members, including after the project has started.
https://alexsoto.dev/static/community-built-eink-laptop-project/slides.pdf (Slides 45-49)
https://ei2030.zulipchat.com/register/
Hardware: RedBoard Artemis, SAMD51, Dialog 14695, ESP32, STM32, and other MCUs/MPUs which can be used, including ones with e-paper already connected, like the M5Paper or LILYGO® TTGO T5.
Looking for:
"At this point, we need other people interested in the idea of this super-low-power device with its “I’m not a regular laptop” aspiration. You can help us figure out the key important areas for us to focus on. No special skills needed! If you like the sound of such a device, email us and join! No required time-commitment and no contribution is too small and no worries of all of this microcontroller-type talk goes over your head. We need you!"
If you do have experience (and it is welcome), these fields are of particular help:
"Embedded Development
Electrical Engineering
Software Developers
Reverse-Engineering
Writers/Researchers"
This may sound like a purely high-tech project (read: hard/inaccessible), but it is not, since leading-edge hardware/tech is now available in the open-source community. The SparkFun company (which makes the Artemis boards with the Apollo3) helped change that perception, because if NASA used SparkFun's altimeter, then it means open-source tech can be (and has been) designed to be both leading tech and accessible. Other technology companies like Adafruit, Digi-Key & Mouser also provide high-quality electronics to non-business entities (i.e., hobbyists).
Technology that optimizes low-power CPUs for both sleep and active modes has a benefit not only for wearables and IoT, but also for user interface applications. This avenue (Cortex-A series processors, as opposed to Cortex-M0 or M4) has rarely been developed for, likely due to the power needs of the most performance-hungry apps.
Items: Microcontrollers, e-paper display (no minimum resolution, but should be capable of displaying terminal)
For a microcontroller, microprocessor, display, external memory/storage, and OS to be considered for this project, it must meet the Power First-Design approach: [https://github.com/EI2030/Low-power-E-Paper-OS/blob/master/wiki/tri-design-approach.md] (Link reverted to prevent broken links).
Non-Code contributors welcome: https://github.com/readme/featured/open-source-non-code-contributions
Some interesting new processor finds:
https://perceive.io/product/ergo/
https://www.st.com/resource/en/datasheet/stm32u585ai.pdf
Some very early low-power microcontrollers that I catalogued back in 2011 and forgot about:
https://elinux.org/RaspberryPi_Laptop
https://www.radiolocman.com/news/new.html?di=64911 https://en.wikipedia.org/wiki/EFM32
Ambiq Update, March 2023 (p.10)
A Substack newsletter called ChipLetter has a great piece on the early days of Xerox PARC: https://thechipletter.substack.com/p/chip-letter-links-no-21-xerox-parc Part of it is paywalled, but there is substantial content, including a reference to LA Times writer Michael Hiltzik and his 1999 book, "The Dealers of Lightning". Written just one year before the dot-com bubble burst, a biography of a company with such unfettered access to frank employee interviews would be highly unusual today, as brands have many more trade secrets to protect in an ultra-competitive market. Nonetheless, the book reads with a much more activist tone than the fluff today in the Wirecutter section of the NYT. https://en.wikipedia.org/wiki/PARC_(company)#The_GUI
3-4-2024
https://albertcory50.substack.com/p/bring-back-private-offices An interesting story on the myth of open (collaboration) spaces. Covers Xerox and Bell Labs offices.
https://www.404media.co/elon-musk-tweeted-a-thing/ "If we want journalism to survive we need to move away from the model where dozens of humans write the same exact blog about an errant Elon Musk tweet in hopes of appeasing an algorithm that is actively changing to kill this exact business model" -Jason Koebler
Of the few human journalists remaining at major news outlets, most are forced to choose between covering thinly veneered promotional reviews, while benefiting from association with an established institutional name/trademark like the NYT or Wired (some would call it "legacy media," which is admittedly offensive even to me) that might have access to some advanced SEO tools (or would be foolish for not trying), and covering a story that is not even on the algorithmic radar. Someone could discover the next-gen fusion reactor, and tech media wouldn't cover it because they wouldn't be able to explain it to their audience (yet they have no problem using the phrases "mansplainer" and "tech bro" when ridiculing Silicon Valley). So which will it be, BigMedia: educating readers about basic science concepts, or debating the minor details of which new product you should or should not buy (while plugging nearly every model of the company you're covering)?
Hobbits engaging in hobbyism (smelting a ring in a volcano to prevent Smeagol from finagling)
Several years ago, an article which went virtually unnoticed still rings true: https://www.nationalgeographic.com/science/article/160225-solar-calculator-history-energy-objects Children from the Silent Generation grew up with Erector sets as a hobby. Before the internet, tinkerers from the baby boom era spent plenty of time working with antennas to build radios out of transistors. Generation X and early Millennials are probably the only two generations that grew up with pocket-sized calculators, which cost more than $400 in the early 1970s. By the late '90s, cell phones began integrating all of these features into the first multimedia smartphones, and the ease of access to computing made hobbyism less about access to basic information and more about high-speed social media.
A little bit of a handicap encourages each generation not to become super dependent on convenience. Call it the most basic PC. My first PC came with a CD pack that included an encyclopedia (MS Encarta '95). Before that, many homes had encyclopedias on their bookshelves. Portable computers are even more capable of being multi-featured PCs in even less of a thermal design power envelope (TDP), and the focus of journalism has often been "we're covering all the new features of a product release; let us educate you, and throw away your old PC (we wouldn't know what to do with it; Linux? What's that?)".
Hyperbole aside, a balanced tech media like Wired, The Verge, Ars Technica, or ExtremeTech would cover 10% software, 10% hardware, 10% superconductors, 10% particle physics, 10% climatology, 10% astrophysics, 10% biology, 10% culture/politics, 10% business, and 10% pure mathematics (you can see where my bias leans; some of this was humor-intended). Instead it's 95% a leading FAANG company releasing incremental product/feature XYZ.