Technologies
Bill Gates Has Published the Original Microsoft Source Code
It’s “the coolest code I’ve ever written,” the Microsoft co-founder says.

If you want to see the original source code that started Microsoft, Bill Gates is now sharing it. On Wednesday, the Microsoft co-founder posted it on his Gates Notes blog, reminiscing about the company’s early days for its 50th anniversary. Gates has written plenty of code in those five decades, but he called this “the coolest code I’ve ever written.”
Sharing a photo of himself holding a huge stack of paper showing the code, Gates wrote that he was inspired by the January 1975 issue of Popular Electronics magazine. That issue featured a cover photo of an Altair 8800, a groundbreaking personal computer created by a small company called MITS.
The 19-year-old Gates and his Harvard pal Paul Allen reached out to Altair’s creators and told them they had a version of the programming language BASIC for the chip that the Altair 8800 ran on. Such software would let people program the Altair.
“There was just one problem,” Gates wrote. “We didn’t.”
Micro-Soft is born
Gates said he and friends “coded day and night for two months to create the software we said already existed.” Gates and Allen then presented the code to the president of MITS, who agreed to license the software. “Altair BASIC became the first product of our new company, which we decided to call Micro-Soft,” Gates wrote. “We later dropped the hyphen.”
And the rest, as they say, is software history. You can download that 50-year-old code from Gates’s post. “Computer programming has come a long way over the last 50 years, but I’m still super proud of how it turned out,” he wrote.
Melinda French Gates’s new book
Also making headlines this week was Gates’s former wife, Melinda French Gates, whose new book, The Next Day, comes out April 15. As that date approaches, she’s opening up about the end of her marriage to Gates.
The couple divorced in 2021 after 27 years and three children. According to People magazine, Melinda French Gates wrote in the book that in 2019 she was “having nightmares about a beautiful house collapsing all around her — and then waking up in a panic night after night.”
She acknowledged what Bill Gates has publicly stated — that he wasn’t always faithful in the marriage — and said she was also disturbed by Gates’s meetings with child sex offender Jeffrey Epstein. Bill Gates has since said he regrets meeting Epstein.
Melinda French Gates said her bad dreams would eventually change into images of her family on the edge of a cliff where she “plummeted” into a void. “I knew, in that moment, that I was going to have to make a decision — and that I was going to have to make it by myself,” she wrote, according to the People article.
Want New iPhone Controls? Here’s the Latest From iOS 18.4
One control in particular brings Apple’s Visual Intelligence to more iPhones.

Apple released iOS 18.4 on March 31, and the update brought bug fixes, new emoji and a new recipes section in Apple News to all iPhones. The update also brought a handful of new controls to your iPhone’s Control Center, including a control that brings Visual Intelligence to the iPhone 15 Pro and Pro Max.
When Apple released iOS 18 in September, the update remodeled the Control Center and gave you more, uh… control over how the feature functions. With iOS 18, you can resize controls, assign some controls to their own dedicated page and adjust the placement of controls to your liking. Apple has also introduced more controls to the feature, making it a central hub for all your most-used iPhone features.
With iOS 18.4, Apple continues to expand the number of controls you can add to the Control Center. If you have the update on your iPhone, you can add ambient music controls, and Apple Intelligence-enabled iPhones get a few AI controls in the menu, too.
Read more: Everything You Need to Know About iOS 18
Here’s what to know about the new controls and how to add them to your Control Center.
Ambient Music controls
Apple gave everyone four new controls in the Control Center library under the Ambient Music category: Sleep, Chill, Productivity and Wellbeing. Each control activates a playlist filled with music that corresponds to its name — Sleep, for example, plays ambient music to help lull you to sleep.
Some studies suggest white noise could help adults learn words and improve learning in environments full of distractions. According to the mental health company Calm, certain kinds of music can help you fall asleep faster and improve the quality of your sleep. So these new controls can help you learn, fall asleep and more.
Here’s how to find these controls.
1. Swipe down from the top-right corner of your Home Screen to open your Control Center.
2. Tap the plus (+) sign in the top-left corner of your screen.
3. Tap Add a Control.
You should see a section of controls called Ambient Music. You can also search for “Ambient Music” in the search bar at the top of the control library.
Under Ambient Music, you’ll see all four controls. Tap one (or all) of them to add them to your Control Center, then go back to your Control Center and tap one to start playing music.
You can also change the playlist for each control. Here’s how.
1. Swipe down from the top-right corner of your Home Screen to open your Control Center.
2. Tap the plus (+) sign in the top-left corner of your screen.
3. Tap the Ambient Music control you want to edit.
4. Tap the playlist to the right of Playlist.
A dropdown menu will appear with additional playlists for each control. So if you’re in the Sleep control, you’ll see other playlists like Restful Notes and Lo-Fi Snooze. If you have playlists in your Music app, you’ll also see the option From Library, which pulls music from your library. Tap whichever playlist you want and it will be assigned to that control.
Apple already lets you transform your iPhone into a white noise machine with Background Sounds, like ocean and rain. But Ambient Music is actual music as opposed to more static sounds like in that feature.
Both of these features feel like a way for Apple to present itself as the first option when you want some background music to help you fall asleep or be productive. Other services, like Spotify and YouTube, already have ambient music playlists like these, so this could be Apple’s way of capturing some of those services’ audience.
Apple Intelligence controls
Only people with an iPhone 15 Pro, iPhone 15 Pro Max or an iPhone 16 model can access Apple Intelligence features for now, and those people got three new dedicated Apple Intelligence controls with iOS 18.4: Talk to Siri, Type to Siri and Visual Intelligence.
Here’s how to find these controls.
1. Swipe down from the top-right corner of your Home Screen to open your Control Center.
2. Tap the plus (+) sign in the top-left corner of your screen.
3. Tap Add a Control.
Then you can use the search bar near the top of the screen to search for “Apple Intelligence,” or you can scroll through the menu to find the Apple Intelligence & Siri section. Tap any (or all) of these controls to add them to your Control Center.
While the Talk to Siri and Type to Siri controls can be helpful if you have trouble accessing the digital assistant, the Visual Intelligence control is notable because it brings that Apple Intelligence feature to the iPhone 15 Pro and Pro Max.
Originally, Visual Intelligence was accessible only on the iPhone 16 lineup, through those devices’ Camera Control button.
With iOS 18.4, Visual Intelligence is now accessible on more devices thanks to the dedicated control in Control Center. But remember, Visual Intelligence is like any other AI tool, so it won’t always be accurate. You should double-check the results and any important information it shows you.
For more on iOS 18, here are all the new emoji you can use now and what to know about the recipes section in Apple News. You can also check out everything included in iOS 18.4 and our iOS 18 cheat sheet.
Gemini Live Now Has Eyes. We Put the New Feature to the Test
The new feature gives Gemini Live eyes to «see.» I put it through a series of tests. Here are the results.

There I was, walking around my apartment, taking a video with my phone and talking to Google’s Gemini Live. I was giving the AI a tour – and a quiz, asking it to name specific objects it saw. After it identified the flowers in a vase in my living room (chamomile and dianthus, by the way), I tried a curveball: I asked it to tell me where I’d left a pair of scissors. “I just spotted your scissors on the table, right next to the green package of pistachios. Do you see them?”
It was right, and I was wowed.
Gemini Live will recognize a whole lot more than household odds and ends. Google says it’ll help you navigate a crowded train station or figure out the filling of a pastry. It can give you deeper information about artwork, like where an object originated and whether it was a limited edition.
It’s more than just a souped-up Google Lens. You talk with it and it talks to you. I didn’t need to speak to Gemini in any particular way – it was as casual as any conversation. Way better than talking with the old Google Assistant that the company is quickly phasing out.
Google and Samsung are just now starting to formally roll out the feature to all Pixel 9 phones (including the new Pixel 9a) and Galaxy S25 phones. It’s free on those devices, and other Pixel phones can access it via a Google AI Premium subscription. Google also released a new YouTube video for the April 2025 Pixel Drop showcasing the feature, and there’s now a dedicated page on the Google Store for it.
All you have to do to get started is go live with Gemini, enable the camera and start talking.
Gemini Live follows on from Google’s Project Astra, first revealed last year as possibly the company’s biggest “we’re in the future” feature: an experimental next step for generative AI, beyond simply typing or speaking prompts into a chatbot like ChatGPT, Claude or Gemini. It comes as AI companies continue to dramatically expand the capabilities of AI tools, from video generation to raw processing power. Somewhat similar to Gemini Live is Apple’s Visual Intelligence, which the iPhone maker released in beta form late last year.
My big takeaway is that a feature like Gemini Live has the potential to change how we interact with the world around us, melding the digital and physical just by pointing your camera at almost anything.
I put Gemini Live to a real test
Somehow Gemini Live showed up on my Pixel 9 Pro XL a few days early, so I’ve already had a chance to play around with it.
The first time I tried it, Gemini was shockingly accurate when I placed a very specific gaming collectible of a stuffed rabbit in my camera’s view. The second time, I showed it to a friend when we were in an art gallery. It not only identified the tortoise on a cross (don’t ask me), but it also immediately identified and translated the kanji right next to the tortoise, giving both of us chills and leaving us more than a little creeped out. In a good way, I think.
In the tour of my apartment, I was following the lead of the demo that Google did last summer when it first showed off these Live video AI capabilities. I tried random objects in my apartment (fruit, books, Chapstick), many of which it easily identified.
Then I got thinking about how I could stress-test the feature. I tried to screen-record it in action, but it consistently fell apart at that task. And what if I went off the beaten path with it? I’m a huge fan of the horror genre — movies, TV shows, video games — and have countless collectibles, trinkets and what have you. How well would it do with more obscure stuff — like my horror-themed collectibles?
First, let me say that Gemini can be both absolutely incredible and ridiculously frustrating in the same round of questions. I had roughly 11 objects that I was asking Gemini to identify, and it would sometimes get worse the longer the live session ran, so I had to limit sessions to only one or two objects. My guess is that Gemini attempted to use contextual information from previously identified objects to guess new objects put in front of it, which sort of makes sense, but ultimately neither I nor it benefited from this.
Sometimes, Gemini was just on point, easily landing the correct answers with no fuss or confusion, but this tended to happen with more recent or popular objects. For example, I was pretty surprised when it immediately guessed one of my test objects was not only from Destiny 2, but was a limited edition from a seasonal event from last year.
At other times, Gemini would be way off the mark, and I would need to give it more hints to get into the ballpark of the right answer. And sometimes, it seemed as though Gemini was taking context from my previous live sessions to come up with answers, identifying multiple objects as coming from Silent Hill when they were not. I have a display case dedicated to the game series, so I could see why it would want to dip into that territory quickly.
Gemini can get full-on bugged out at times. On more than one occasion, Gemini misidentified one of the items as a made-up character from the unreleased Silent Hill: f game, clearly merging pieces of different titles into something that never was. The other consistent bug I experienced was when Gemini would produce an incorrect answer, and I would correct it and hint closer at the answer — or straight up give it the answer — only to have it repeat the incorrect answer as if it were a new guess. When that happened, I would close the session and start a new one, which wasn’t always helpful.
One trick I found was that some conversations did better than others. If I scrolled through my Gemini conversation list, tapped an old chat that had gotten a specific item correct and then went live again from that chat, it could identify the items without issue — even when a fresh session given the same language could not.
Google didn’t respond to my requests for more information on how Gemini Live works.
I wanted Gemini to successfully answer my sometimes highly specific questions, so I provided plenty of hints to get there. The nudges were often helpful, but not always. Below are a series of objects I tried to get Gemini to identify and provide information about.
Today’s NYT Strands Hints, Answers and Help for April 11, #404
Do you want to play a game? Here are hints and answers for the NYT Strands puzzle No. 404 for April 11.

Looking for the most recent Strands answer? Click here for our daily Strands hints, as well as our daily answers and hints for The New York Times Mini Crossword, Wordle, Connections and Connections: Sports Edition puzzles.
Today’s NYT Strands puzzle might be tricky. You’ll do well if you watch a lot of a certain kind of TV competition. (I don’t, so I did horribly today.) If you need hints and answers, read on.
I go into depth about the rules for Strands in this story.
If you’re looking for today’s Wordle, Connections and Mini Crossword answers, you can visit CNET’s NYT puzzle hints page.
Read more: NYT Connections Turns 1: These Are the 5 Toughest Puzzles So Far
Hint for today’s Strands puzzle
Today’s Strands theme is: Buzzing in
If that doesn’t help you, here’s a clue: Win big.
Clue words to unlock in-game hints
Your goal is to find hidden words that fit the puzzle’s theme. If you’re stuck, find any words you can. Every time you find three words of four letters or more, Strands will reveal one of the theme words. These are the words I used to get those hints, but any words of four or more letters that you find will work:
- READ, READY, DIME, CHOP, CHOPS, PASS, DREAD, DOME, DOMES, WORD, SHOP, SHOW, WORDS, PARE, SWORD.
Answers for today’s Strands puzzle
These are the answers that tie into the theme. The goal of the puzzle is to find them all, including the spangram, a theme word that reaches from one side of the puzzle to the other. When you’ve got all of them (I originally thought there were always eight but learned that the number can vary), every letter on the board will be used. Here are the nonspangram answers:
- LINGO, PYRAMID, JEOPARDY, PASSWORD, CATCHPHRASE.
Today’s Strands spangram
Today’s Strands spangram is GAMESHOWS. To find it, start with the G that’s five letters down in the far-left column, and wind across.
Toughest Strands puzzles
Here are some of the Strands topics I’ve found to be the toughest in recent weeks.
#1: Dated slang, Jan. 21. Maybe you didn’t even use this lingo when it was cool. Toughest word: PHAT.
#2: Thar she blows! Jan. 15. I guess marine biologists might ace this one. Toughest word: BALEEN or RIGHT.
#3: Off the hook, Jan. 9. Similar to the Jan. 15 puzzle in that it helps to know a lot about sea creatures. Sorry, Charlie. Toughest word: BIGEYE or SKIPJACK.