Let's say that back in 1930s when they were assigning frequencies for the broadcast television channels, they allocated enough extra bandwidth for a future color (chroma) signal, apart from the existing monochrome (luma) signal.
If the bandwidth was available, would it have been possible to include separate chroma and luma components in the broadcast signal without the two interfering with each other, thereby producing a much cleaner color image while maintaining backward compatibility with the original B&W TV sets?
Isn't that essentially what happened? In the 1930s, the channels were allocated at 6 MHz spacing, but only needed about 3 MHz of bandwidth for the luma channel. There might have been some foresight involved, but the gaps between channels also allowed for cheaper, less precise tuners. Then in the 1950s they added a chroma signal on a 3.58 MHz subcarrier, expanding the occupied channel bandwidth to just under 6 MHz.
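For the curious, the standard NTSC numbers check out. A quick back-of-the-envelope sketch (these are the published NTSC figures, not anything from the comment above):

```python
# Rough check of the NTSC-color channel layout using standard figures.

# Color line rate: 4.5 MHz (the fixed video-to-audio carrier spacing)
# divided by 286, so the audio carrier is an exact harmonic of it.
line_rate = 4_500_000 / 286              # ~15734.27 Hz

# Chroma subcarrier: an odd half-multiple (455/2) of the line rate,
# so chroma energy interleaves between the luma harmonics.
chroma_subcarrier = (455 / 2) * line_rate   # ~3.579545 MHz

# Within a 6 MHz channel: the video carrier sits 1.25 MHz above the
# lower channel edge, and the audio carrier 4.5 MHz above the video
# carrier, leaving a 0.25 MHz guard band at the top of the channel.
video_carrier_offset = 1.25e6
audio_carrier_offset = video_carrier_offset + 4.5e6   # 5.75 MHz

print(round(chroma_subcarrier))          # 3579545
print(audio_carrier_offset / 1e6)        # 5.75
```

The 455/2 factor is the trick that let chroma share the band with luma without (mostly) interfering, which is what made backward compatibility possible in the first place.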
6 MHz was there from the start. If it wasn't, the CBS field-sequential system might have won. Both government (FCC) and industry (RCA) were pushing for a backward-compatible system, and there was enough bandwidth to pull it off. The OP asks what would've happened had chroma been given its own separate, non-overlapping band; within the existing 6 MHz channels, that was not possible while maintaining compatibility.
You could argue that if extra bandwidth were available it might be better allocated elsewhere for improved picture quality. But yes. It would've reduced lots of engineering complexity and probably would've looked very much like PALplus.
From my years of iOS development, and based on the history at https://xcodereleases.com, Apple typically ships two major Xcode updates each year:
- X.0 (September): bumps Swift, SDK versions, etc. It also tends to have a noticeably longer beta cycle than other releases.
- X.3 or X.4 (around March): bumps Swift again and raises the minimum required macOS version.
Other releases in between are usually smaller updates that add features or fix bugs, but they don’t involve major toolchain-level or fundamental changes.
Today’s release doesn’t bump the Swift version, which suggests the core toolchain is essentially the same as Xcode 26.2—so it makes sense that the minimum macOS version wasn’t raised either.
Agreed - I tried installing Xcode 26.3 on my Mac running Sequoia, and there's no "Intelligence" pane in Xcode's settings to connect Claude like there is in the docs.
Computer Chronicles and MotorWeek were fixtures of my Saturday afternoons as a young kid in the 80s. 35+ years later, both shows have become fascinating and priceless time capsules of the era.
For me it was those two, but also Bob Ross (of course!) and Star Gazers (Keep looking up!). I enjoyed drawing and astronomy, and the entire reason I got into computers was to program the equations from the Astronomy with Your Pocket Calculator book.
That's also why Apple renamed OpenStep to Cocoa. Java was supposed to be the primary development language for Mac OS X (because Java and Cocoa go great together).
For a while you could write OSX stuff in Java. I did. They also had some pretty okay JNI bindings for stuff like quicktime, so I was able to write a java application that used quicktime to load and display videos. I needed to analyze video streams for my dissertation, so I wrote some custom visualization stuff in java that used the quicktime bindings. Good times.
When you hear people complaining about Time Machine you can stop listening once they mention their NAS. Time Machine over the network has never been properly supported and is inherently unsafe.
Time Machine to a local drive connected via USB is great.
If you want to back up over the network, you will have to find another solution.
I'm very surprised it's organized as just a single 162 kB source document instead of being divided into smaller modules to make the code easier to work with, speed up build times, etc.
Were PDP-10 text editors of the day able to easily work with such large documents? And how long would it typically take to assemble all of that code in one go?
I don't think TECO is inherently bad for a file this size. What might be tough is that if anything were compute-bound and the processor were loaded down with other jobs, you could hit the usual timesharing problems. And if you wanted to look at a chunk of text, just sending it to your TTY would be slow.
If you were using a KA10 variant, your process could get evicted pretty easily, because the core memory requirements of all the running jobs had to fit in 256K words. The variants with paging handled virtual memory more efficiently.
Compile times would be affected by the same issues.
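For scale, the file itself is small next to that core limit. A quick sanity check (my own arithmetic, assuming the usual packing of five 7-bit ASCII characters per 36-bit PDP-10 word):

```python
# Does a 162 kB source file even fit in KA10-era core?
# A PDP-10 packs five 7-bit ASCII characters into each 36-bit word.
source_bytes = 162_000
words_needed = source_bytes / 5          # ~32,400 words for the text
core_limit = 256 * 1024                  # 256K-word ceiling on the KA10

print(words_needed)                      # 32400.0
print(words_needed < core_limit)         # True: the file fits easily
```

So holding the whole file in memory was never the bottleneck; contention with other timeshared jobs was.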
While it's true that early production machines had reliability problems, the same was also true for the C64. The machine we got for xmas 1983 was as solid as a tank.
The tapes were extremely robust and resistant to abuse, much more so than floppy disks. I tried to fry tapes, and couldn't.
For games, the tape drives were surprisingly effective. Sequential data transfer rates were faster than the C64 disk drive's, and unlike on the C64 they operated independently from the main CPU, with DMA access to the full 64 kB address space.
This means that many games were up and running in seconds and could load upcoming levels in the background while you were playing the game.
(The tape drives were much less effective for random-access, file-based storage, as the seek times were obviously atrocious compared to a disk drive.)
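In modern terms, the "load the next level while you play" pattern the Adam's DMA enabled is just a prefetch into a second buffer. A hypothetical sketch (not Adam code; a Python thread stands in for the DMA transfer):

```python
# Hypothetical illustration of DMA-style background level loading.
# The worker thread plays the role of the Adam's tape DMA, which ran
# independently of the main CPU.
import threading
import time
import queue

def tape_read(level_id: int) -> str:
    """Simulate a slow sequential read from tape."""
    time.sleep(0.05)                     # stand-in for transfer time
    return f"level-{level_id}-data"

def play(current_level: int) -> str:
    next_data: queue.Queue = queue.Queue(maxsize=1)
    # Kick off the next level's load before gameplay starts.
    t = threading.Thread(
        target=lambda: next_data.put(tape_read(current_level + 1)))
    t.start()
    time.sleep(0.05)                     # ...gameplay happens here...
    t.join()
    return next_data.get()               # next level is already in memory

print(play(1))                           # level-2-data
```

The key property is the overlap: the transfer costs (almost) no gameplay time because the CPU never has to babysit it, which is exactly what the C64's bit-banged serial bus could not do.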
First-party software was also very high quality.
The problem was the business plan. Coleco made the same mistake as Atari and Texas Instruments, in that the business plan was modelled after the game console business. The tapes were expensive and proprietary when they didn't need to be, and the 3rd-party software ecosystem was completely locked down. Technical info was unavailable for hobbyists and independent developers.
By the time the Adam was released, the C64 and Apple IIe were already entrenched in the home market, with exponentially expanding libraries of independent software. The Adam's locked-down ecosystem couldn't compete.
It only took one year for hobbyists to completely reverse engineer the Adam, at which point some interesting independent software started to appear. But by that time the business was already dead.
> Sequential data transfer rates are faster than the C64 disk drive
The C64 disk interface is notoriously broken. The C64 and the 15x1 drives effectively had the same relatively fast processor, but with a tiny pipe connecting the two.