
Not surprising at all. It's become clear that what they've been doing with iOS all along is bootstrapping a new operating system. They started with Unix, not because Unix was optimal for what they wanted to do, but it's what they had and it worked well enough. Copland and Taligent and the rest failed because of Second-System Effect, which Apple was smart enough to avoid.

(As RMS wrote in 1983, "Unix is not my ideal system, but it is not too bad. The essential features of Unix seem to be good ones, and I think I can fill in what Unix lacks without spoiling them. And a system compatible with Unix would be convenient for many other people to adopt.")

Whereas such features as "background tasks" were simply a natural consequence of the architecture of Unix, Apple didn't expose that on iOS. Now they've finally gotten around to designing the background task subsystem they really want, and it's not just an emergent property of the generic Unix way. It's built for protecting battery life and privacy. It happens to be built on Unix processes (right?), but that's just an implementation detail for them.

They've been gradually deprecating Unix for 20 years, and designing the OS they want. iOS and its App Store allowed them to kill off large sections of the old interface at once. I fully expect inside of 5 years for all of Apple's operating systems to drop "UNIX" certification and become almost unrecognizable as "Unix". They've got enough market clout now that people will port Ruby/Python/Perl to a non-Unix macOS, just as they port them to Microsoft Windows.



I think you’re overthinking this. The vast majority of Ruby/Python programmers have been installing their own copies themselves, because Apple was pretty slow in updating them (Ruby is usually a few minor releases behind, Python is still 2.7 IIRC). People will just keep doing that, and nothing will change.

They may or may not be moving off Unix, but this isn’t evidence of that.


And it's to be appreciated. When installing my own Python version, I'm worried it might interfere with system tools that might be written in it.


As a counter-anecdote, the built-in Ruby interpreter in Mac OS X Tiger was the way I learned to program and how I got interested in programming. I don't think a bare Ruby interpreter is the best way to get into programming, but Apple hasn't been great at encouraging programming among kids. As a kid, I found its development tools totally baffling, whereas Ruby was much easier to understand.


> Apple hasn't been great at encouraging programming among kids

You should check out what they’ve been doing for years with Swift Playgrounds on the iPad


But that's teaching a new generation of coders a set of tools designed only for Apple. Ruby (Python etc) are open and easily available on other platforms, Swift may be in the future but it isn't now.


Swift is open and available on other platforms. I can install it on a raspberry pi if I want.

https://swiftreviewer.com/2018/12/21/swift-programming-on-ra...


Once you’ve learned one programming language, it’s not hard to learn a second. It’s not terribly important what your first language is, and Swift isn’t a bad starting point anyway.


Huh? You can download Ubuntu builds right on their site

https://swift.org/download/


Funny. I learned on QBasic, and I remember they were teaching DrRacket at UBC to first-year students. They were teaching Java when I was in engineering, and my career has gotten basically zero mileage from that stinking heap of uncollected garbage.


How about using a web browser with JavaScript?

Maybe not the best language for someone new, but it's widely available.


How about just a web browser?

You can run almost any language from a web browser these days:

https://repl.it/languages/

A lot of educational sites offer embedded interpreters like this.

It has never been easier to start to learn programming. The real challenge nowadays is maintaining attention...


One of the things that drew me to Macs as I started to learn programming in '05 was that I could simply open any MacBook, type python/ruby/perl, and write some code.


Different time. Now you've got Homebrew.


Which installs itself based on the system Ruby. Will be interesting to see how that plays out.


Like it works now with older versions of macOS and on Linux: it downloads a Ruby binary when it's not available on the host system.


They just pack a statically compiled Ruby interpreter with Homebrew. Not very complex or novel. Disadvantage: the install becomes larger.


Sure, it’s not a hard engineering challenge. However, installation instructions right now are as simple as “run this one-liner on your terminal”. That’ll have to materially change.


> Sure, it’s not a hard engineering challenge. However, installation instructions right now are as simple as “run this one-liner on your terminal”. That’ll have to materially change.

It could just as well still remain a one-liner. Every system Homebrew is installed on has Bash installed (not the latest, but still).

Instead of

> /usr/bin/ruby -e "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/master/in...

It could become

> curl -fsSL https://raw.githubusercontent.com/Homebrew/install/master/in... | bash

With the install script containing a payload containing the current Ruby script plus a Ruby interpreter. Still a "one liner".
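The self-extracting trick described above is a long-standing shell pattern. A rough sketch (hypothetical filenames, not Homebrew's actual installer): everything below a marker line is treated as the payload, and the script slices it out of itself with awk/tail before exiting.

```shell
#!/bin/sh
# Sketch of a self-extracting installer: everything after the
# __PAYLOAD__ marker is treated as an embedded payload.
set -e

# Build a demo installer that carries its payload inline.
cat > installer.sh <<'EOF'
#!/bin/sh
# Find the line just past the marker, then emit the payload.
start=$(awk '/^__PAYLOAD__$/ { print NR + 1; exit }' "$0")
tail -n +"$start" "$0" > payload.txt
echo "extracted $(wc -c < payload.txt | tr -d ' ') bytes"
exit 0
__PAYLOAD__
pretend this is a bundled Ruby interpreter
EOF

sh installer.sh    # prints "extracted 43 bytes"
```

The `exit 0` before the marker is what keeps the shell from trying to parse the payload as code.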


> Every system Homebrew is installed on has Bash installed

Ironically, they're also moving from Bash to Zsh as the default shell. Again, not an insurmountable challenge (just make sure the shebang is correct in your scripts; Bash will likely still be around somewhere in macOS for the foreseeable future), but another little hurdle to mind.


As far as I'm aware, they're not changing anything about which shells are installed or which executable /bin/sh points to; they're just changing the default login shell for new users. So nothing should have to change about the scripts that are (or have been) written.


Doesn't surprise me given they dislike GPLv3 and Zsh is MIT.

Personally I try to use #!/bin/sh as much as possible, and Fish as my default shell.
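Targeting #!/bin/sh mostly means avoiding bashisms. A small sketch of the usual substitutions, which should run identically under dash, bash, or macOS's /bin/sh, so the user's login shell (zsh, fish) becomes irrelevant:

```shell
#!/bin/sh
# POSIX-only constructs; no bashisms.

name="world"

# Use `[ ]` and `=` instead of bash's `[[ ]]` and `==`.
if [ "$name" = "world" ]; then
  greeting="hello, $name"
fi

# Use `$(...)` command substitution, not backticks.
year=$(date +%Y)

# Use `set --` and "$@" instead of bash arrays.
set -- one two three

echo "$greeting ($year), $# args: $*"
```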


upvoted for fish


Take a look at the installation instructions for the Rust compiler.

https://rustup.rs/


But you have to download Xcode if you want a C compiler!

On the up side though, clang has made the C/C++/ObjC world a much better place.


I had the opposite experience with AppleScript - the development tool (Script Editor) was simple enough for me to understand that it was the way I taught myself to program when I was 13.

(OK, I learned Logo in school when I was 7, but programming became way more interesting when I discovered AppleScript's APIs to control all the programs on my Mac).


>As a counter-anecdote, the built in Ruby interpreter in Mac OS X Tiger was the way I learned to program and how I got interested in programming

And you think a one-click download and install of Ruby would have prevented that?

Not to mention Apple also has Swift, and Swift Playgrounds and so on for kids these days.


Sure... but that was then. I too had a real hard time learning Ruby back in the day. But now if you google any Ruby or Rails how-to, it will include a one-line brew and rbenv install.

So I get what you are saying but I think we are in a new world here.


Swift Playground for macOS? One can hope!


Technically Xcode has supported Playgrounds since before the iOS app existed, though if you mean the teaching content, it doesn't have that.


Right, I think most developers know that it's a bad idea to mess with the old, unmaintained system versions of Python, Ruby, or Perl. Almost everyone I know uses Homebrew to install the bleeding edge versions separately, and that will save you a world of pain.

For Python folks, just install pyenv/pipenv and you can easily maintain `system` (Python), Python 2.7.x, Python 3.7.3, etc. as separate environments. For Ruby people it's rbenv.
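The mechanism behind pyenv/rbenv can be sketched in a few lines of shell (a toy model with made-up paths, not pyenv's real layout): a "shims" directory goes at the front of PATH, and each shim execs whichever interpreter version is currently selected.

```shell
#!/bin/sh
# Toy model of a version manager's shim dispatch.
set -e

mkdir -p demo/versions/3.7.3/bin demo/shims

# A stand-in for a real interpreter install.
cat > demo/versions/3.7.3/bin/python <<'EOF'
#!/bin/sh
echo "Python 3.7.3 (demo)"
EOF
chmod +x demo/versions/3.7.3/bin/python

# The shim reads the selected version, then execs the real binary.
cat > demo/shims/python <<'EOF'
#!/bin/sh
version=$(cat "$DEMO_ROOT/version")
exec "$DEMO_ROOT/versions/$version/bin/python" "$@"
EOF
chmod +x demo/shims/python

echo "3.7.3" > demo/version

# Because the shims dir is first in PATH, `python` hits the shim.
DEMO_ROOT="$PWD/demo" PATH="$PWD/demo/shims:$PATH" python
# prints: Python 3.7.3 (demo)
```

Switching versions is then just a matter of rewriting the version file; the shims never change.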


Or anaconda!


This alone isn't evidence of that, but it's not the only data point we have. We've got 20 years of observations. They dropped X11 (and didn't use it for the main display in the first place), created new process APIs (like GCD and background tasks), deprecated standard libraries like OpenGL (and don't support Vulkan), designed their own programming language, etc. All of these point in the same direction.

What will be left? Will it still be "Unix" in any meaningful sense just for having a C compiler and running an old version of Bash? Microsoft Windows did that, too.


> They dropped X11

You don't have to be anti-unix to do that though, any reasonable organization would have done the same. I agree with the rest though.


    They've got enough market clout now that people will 
    port Ruby/Python/Perl to a non-Unix macOS, just as 
    they port them to Microsoft Windows.
Ruby on Windows is a nightmare.

To be specific, Ruby itself (the core language and core libraries) is pretty good on Windows. Great choice for a scripting language, or whatever else you want to do.

But libraries like ActiveRecord, Rails and their myriad dependencies are very very hit or miss on Windows and there's not a lot of support when things go badly.

A lot of the issues are simple - like dead simple. Lots of gem authors don't bother to make things like file paths (forward slash vs. backslash) system-agnostic. Or the gems depend on compiled binary stuff like Nokogiri or ImageMagick, and those are always a bit of an adventure on Windows.

If macOS drifts farther away from Unix, it will cease to be the platform of choice for many of the developers using it today.


> file paths (forward slash vs. backslash)

Windows file APIs have always supported forward slashes in file paths, even before there was Windows. This goes all the way back to MS-DOS 2.0.


The Windows Subsystem for Linux fixes a lot of these problems, you can even point VS Code at Ubuntu now on Windows and avoid CMD entirely.


I was someone who installed the first beta of Windows that included this, just to try it out for Rails. It had LOTS of problems. I filed bugs. I kept trying over the next couple of versions, and finally gave up. I tried it again a few months ago, and, again, ran into a show-stopping problem of some sort, and gave up again. Are you saying that you use it "in anger" for serious development, and have no issues?

People complain about doing Rails on Windows, and, while there's definitely extra friction, and I hate Windows in general, I have really good luck with RubyInstaller. I used to have a hard time with ExecJS, but now I just install Node (on either Mac or Windows), and I'm usually off and running.


I use WSL (without anger) for serious development, for about two years now and have had almost no issues.

Running 'Pengwin' (Debian) Linux via WSL, RVM using Ruby 2.4.1 + Rails 5.1.7 and Ruby 2.6.2 + Rails 6.0.0.beta3

Today, everything works just as expected, right out of the box with no effort. This includes ActiveRecord (to SQLite, MariaDB, and PostgreSQL), including Node.js / Asset Pipeline, Prawn PDF and ImageMagick integrations, uploads to AWS, email integrations, capistrano-based deployments, Heroku Gem + integration, and so on.

It's gotten to the point where we spend more time helping OSX folks figure out occasional Homebrew weirdness, than we do helping Windows folks with WSL. Especially since WSL people can almost always just re-use any Ubuntu instructions verbatim.


Except when you install a native module in WSL and then try to run it from CMD. WSL is no different from having a headless VM and comes with all of the problems of one.


I've done most of my Linux development on WSL for the last year. It struggles sometimes under certain types of heavy IO load, and there are occasional compatibility issues, but overall it's been impressively reliable and convenient.


I am using WSL more and more often nowadays and it's really good now; it works just like you'd expect Ubuntu (or whatever other distro you choose) to work. The only thing I've had to do is turn off anti-virus real-time protection for the Linux subsystem.

Hell, VS Code isn't the only thing that supports WSL. RubyMine can manage rbenv/rvm on WSL and more: https://confluence.jetbrains.com/display/RUBYDEV/How+to+add+...


The funny thing is I've been a Rails dev for years and didn't even know it didn't work on Windows, because I've never tried and don't know any dev who uses Windows.


That just proves you live in a bubble :) I never got into RoR because it sucked so much on Windows and every single tutorial was OSX-focused. I just stuck to alternatives that worked.


So which companies are actually doing RoR development on Windows? Do they also deploy to Windows Server (?) in production? I guess I live in multiple bubbles because I’ve worked for tech companies in several different major metros and I’ve never heard of this as a mainstream thing.


It is not like that. My point is that there's a barrier to entry to RoR that you should own a Mac. I gave up because my home setup is Windows based and I won't change it for the sake of a hyped framework or tool.

But none of this is surprising considering where RoR is coming from, I'm not sad or upset about it either :)


You don't need a Mac. RoR works perfect on Linux as well.


It seems like you're blaming RoR for a problem that is actually due to Windows being "different". Almost every OS besides Windows is a *nix or *nix clone, POSIX-compliant or certified. So who's really the problem here?


In the early days of Rails, say 2006, it was definitely more commonplace... but I guess there were fewer of us back then. I know as a maintainer of curb I have a few Windows users.


I started with a Ruby/Rails project in 2007, and after 3 days of fighting my work-issued Windows machine I just went out and bought a white MacBook, and things were up and running in an hour or two. I've never looked back.


> But libraries like ActiveRecord, Rails and their myriad dependencies are very very hit or miss on Windows and there's not a lot of support when things go badly.

I haven't done Rails for a while, but when I did, Rails (including ActiveRecord and all its other components) Just Worked on Windows. Lots of the peripheral ecosystem of Rails, and Ruby more generally, was a nightmare, but Rails itself was great. Overall, the Ruby experience on Windows (outside of Rails) has gotten better with RubyInstaller + DevKit, which hasn't closed all the gaps but does mean that even libraries with C extensions frequently just gem install and work on Windows. I'd be surprised if Rails itself had gotten worse in this area.


But macOS uses '/' as a directory separator already. They're not going to arbitrarily change trivia like that -- at least, not short of killing the concept of a filesystem altogether. (True, if anyone was going to do that, Apple would, but APFS is brand new. That's not going anywhere.)

Apple isn't changing things just for the sake of change. They're removing 1970's-isms where they hold back the platform. Languages like Ruby/Python/Perl also clean house, from time to time, to remove old cruft. I can't imagine any legacy feature of macOS that Apple would want to deprecate which would make it more difficult to maintain a good port of Ruby/Python/Perl. These are regular general-purpose programming languages now, not just Unix scripting languages.

Yes, Unix has been one of the developer systems of choice for the past couple decades, but it wasn't always so, and it hasn't been the only one. I hear more developers than ever using and enjoying other systems like Windows (PowerShell) today. Even Linux today is pretty far from traditional Unix. I just don't see perfect Unix compatibility as a necessary component like it was in 2001.


> macOS uses '/' as a directory separator already

Depends on what level of the UI you're dealing with.


> But macOS uses '/' as a directory separator already.

IIRC, HFS actually uses a ':'.


Mac OS versions prior to OS X used ':'.

Mac OS X, initially released on 24 March 2001, uses '/', like other Unix-derived systems.

Also, like other unices, OS X uses '\n' as the line ending. Prior Mac OSs used '\r' as the line ending. Windows uses '\r\n'.
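For anyone bitten by the line-ending difference, the standard tools make it easy to see and fix; a small sketch:

```shell
#!/bin/sh
# Inspect and normalize line endings with standard tools.

# Windows-style CRLF endings:
printf 'one\r\ntwo\r\n' > crlf.txt

# od -c shows the raw bytes, so the \r characters are visible.
od -c crlf.txt | head -n 1

# tr strips the carriage returns, leaving Unix-style LF endings.
tr -d '\r' < crlf.txt > lf.txt

wc -c < crlf.txt   # 10 bytes
wc -c < lf.txt     # 8 bytes
```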


In A/UX you could use both.


Similarly, HFS+ used to provide the option of case insensitivity. Both things were built in for legacy-support reasons during the transition from OS 9 to X, as with Carbon.


Macs still come with case insensitivity on by default.


But in Finder you can use "/" in filenames, and can't use ":". They get swapped if you look at the file from Terminal.


Pretty sure HFS doesn't actually store any directory separator. A directory, after all, is just another file, each one containing its own children. Classic Mac OS used ":" when it needed to use something as a path separator.


> Ruby on Windows is a nightmare.

Same story with Node.


Not sure why? Yeah, there were times with the damn node-gyp not compiling for whatever reason. But nowadays things run pretty smoothly.


Besides paths, http_proxy settings on enterprise networks are a major PITA.


I'm still using cntlm for that (as a proxy in front of the enterprise proxy): http://cntlm.sourceforge.net/

(sourceforge is the main link, probably best to visit with ublock origin enabled)


Paths, compilation, etc. Windows isn't the main server platform, so people just don't seem to care.


You can use / on Windows in node perfectly, and Windows node now installs required devtools for native modules at the end of the process.

It's pretty easy these days.


You can use it, yep, but devtools aren't enough. You'll hit a lot of issues sooner or later. Windows is a second-class citizen for Node; Linux and macOS are much better. Just sharing my experience.


Being a second-class citizen is very different from being a nightmare. I've been able to build some fairly complex apps (TypeScript, Next.js, Relay, SQLite, Express) without running into any issues at all, and if I do hit a roadblock I know I can bail out to WSL or a VM.

The experience isn't as nice as macOS, but it's not terrible by any stretch.


Personally I moved to Windows about 2 years ago, have been developing node all day during that time and deploying on Linux. It's been great since about node 8. What have your issues been?


Same. WSL is great. Running IDEs in WSL even works great, but with VS Code I don't even do that much anymore. It's like being on Linux for dev, but having Windows for power management, high-DPI settings, drivers, etc. It's working great, hardware selection is nice, games work. I'm really digging it lately.


> power management, high dpi settings, drivers, etc.

Have the same stuff working on Linux, and deploy anything without VM/WSL/telemetry/etc. Guess it's a similar situation with macOS.


> Ruby on Windows is a nightmare.

Nightmare? No. Suboptimal? Yes.

We're running some legacy Rails 3.2 against an Oracle DB on Windows, and it works well enough. It sure is a lot more pleasant on Linux, though.

Biggest challenge on Windows is getting a C compiler and binary dependencies working. Other than that, you need a version manager (same as on other platforms, really, unless you're aiming to shoot yourself in the foot).

Ruby on windows is definitely not great, but "nightmare" is a bit much.


We probably agree on an objective level. I would agree with you that nearly all of it works nearly all of the time.

But that's my idea of a nightmare - when things are suboptimal and you're often scrambling for solutions on a second-class platform. When I say "often" I don't mean "every day", but in my experience it happened enough to make me swear off Rails on Windows forever.

Of course maybe 5.x is different. My experience was in the late 3.x and early 4.x days


> Ruby on Windows is a nightmare.

It hurts when I do this ... [punches self in face] ...


[flagged]


Hint: not every OS is a UNIX clone. Thankfully.


Perl and CPAN work well enough on Windows :-)


The main issue here is that they don't want the hassle and responsibility of breaking everyone's software each time they release a new system-level Python.

They have a hard enough time deprecating Python 2.7 as it is; the smartest deprecation is to not ship a Python 3+ version at all.

You see a similar issue with Python on Debian. Python 2.7 is still the system Python across the Debian tree, and migrating to a new version is very challenging because of the huge impact it has.

That's not to say macOS won't be a place where Python is happy to run - just that Apple doesn't want to dictate a specific version of it in their releases (i.e., developers/users/applications should install their own).


Would you say that Red Hat doing almost exactly the same thing[1] means they are moving away from being Unix-like as well? This is just an idea whose time has come.

You might be right about macOS and its derivatives moving away from being so Unix-like, but by itself this is just sensible housekeeping they would have had to do at some point anyway. It's just not reasonable to expect them to include and maintain deprecated language runtimes indefinitely.

[1]https://developers.redhat.com/blog/2019/05/07/what-no-python...


I'm sure that others here are well familiar with the other officially certified UNIX operating systems[1], but given how many widely used Unix-like systems are not on the list, I don't think Apple gains anything from the certification. Moreover, it is probably deleterious, forcing them to adhere to system designs that made sense in the '70s and '80s.

1: https://www.opengroup.org/openbrand/register/


I'd be surprised if Apple spends all that much money/effort on official Unix certification. How many customers are choosing macOS over something else because of the Unix certification? In a world where Linux exists?

(This discussion, of course, has nothing to do with whether or not macOS adheres to Unix principles or compatibility.)


I disagree. MacOS is hugely popular in academia. A significant number of researchers use MacBooks, as it allows them to have the best of both worlds: a machine that runs MS Office for preparing proposals and dealing with admin, as well as a UNIX system that lets them ssh into whatever workstation or cluster they're running jobs on. Sure, Windows is catching up in this regard, but MacOS has had the advantage of being a much nicer experience.

Getting your products into Universities is a great way to maintain market share. This is true for software like Matlab and even for banks where most students will keep banking with the same bank for decades.


This. I asked for a MacBook at work specifically for this.

I can have a Unix workstation while still being able to run Outlook, Excel and PowerPoint.

I don't want to use Windows, and I don't want to use that pile of crap that LibreOffice is... The Mac was a good compromise for me.


>> (As RMS wrote in 1983, "Unix is not my ideal system, but it is not too bad. The essential features of Unix seem to be good ones, and I think I can fill in what Unix lacks without spoiling them. And a system compatible with Unix would be convenient for many other people to adopt.")

I have a historical question and I wonder if anyone in this thread has the answer: why did RMS (and, presumably, by then his circle) have to wait around for Linus Torvalds to write a kernel so they could have one for GNU? RMS in particular has a reputation as a legendary hacker. Was he not able to write his own kernel?


There is GNU Hurd, which was supposed to be the kernel. However, its development hit a roadblock when the microkernel design turned out to be extremely difficult to debug. Sometimes you can't predict real complexity from the architectural drawing board, and the prevailing hype in academia at the time pointed to microkernels as the future. Linus just wanted to implement his own simple UNIX kernel based on proven principles, without doing original kernel research.


In addition to sibling comments, remember that Linus announced Linux as:

"Hello everybody out there using minix -

I'm doing a (free) operating system (just a hobby, won't be big and professional like gnu) for 386(486) AT clones. (...)"

GNU couldn't use MINIX due to the license (unlike MINIX 3), and as I recall, the BSDs were also under somewhat unclear license terms.

GNU made the libc, the compiler, and various userland tools - but no one did a "simple, get it working" kind of kernel, and then Linux came along.

I suppose part of this might be because in the early GNU days, most GNU hackers used university Unix machines - not personal 80x86 boxes - for development.

If you bought Sun hardware you got Solaris. Only if you bought a PC clone would you be wanting a "real" OS...


Thanks.



They tried; it's GNU Hurd.


One good thing about programming on a Mac is the UNIX environment, which basically works just like on Linux. That makes it great for finding questions/solutions that work (vs. on Windows, for example).

Do you see that changing as macOS gets away from UNIX? It would probably be a bad hit for Apple if Linux then becomes the de facto programming OS once Apple drops UNIX.


> if Linux becomes then the de facto programming OS

It already is. Deployed any production service on macOS lately? Windows and macOS are both now forced to run Linux compatibility layers (Docker, WSL...); the Mac one just happens to be thinner because of the choices made at NeXT back in the day.


What makes you think Apple has any plans to move away from Unix?


The alarmist first comment in this very thread...


From the comment I'm replying to:

> They've been gradually deprecating Unix for 20 years, and designing the OS they want.


I don't see this anti-Unix stance at all in what Apple is doing here. The "P"-languages that shall not be named were never part of Unix/POSIX, and arguably violate Unix principles in that they want to establish a language-centric ecosystem when the Unix userland is about small, language-agnostic programs working together.


>and arguably violate Unix principles in that they want to establish a language-centric ecosystem when the Unix userland is about small, language-agnostic programs working together.

That was never a real consideration where distros/vendors are concerned (UNIX or Linux). Just a nice-to-have goal for the command line userland (and even that, more back in the day).


I would like to believe that the people who use those features would make the much simpler switch to Linux but with the new WSL and (ridiculous) terminal I know a lot of them will be persuaded to transition into the special hell that is Windows.


Ruby/Python/Perl is not UNIX.


Weird, I thought the Unix-likeness was a huge selling point of macOS ("the most user-friendly Unix on the market")


For the tech crowd, that's for sure. But how much is it for their overall market?


The jump from OS 9 to OS X was about as significant as the jump from Win9x to the NT-based XP. Suddenly, multitasking stability without a huge dollop of luck was a reasonable expectation.


> ...just as they port them to Microsoft Windows.

Just as they ported them to MacOS Classic back in the day.


So is Google, with Chrome OS first and then with Fuchsia.


A different solution, and possibly better, would be to run Unix/Linux in a VM rather than worrying about it being native in the system.

If I'm not mistaken, current best practice is to somewhat isolate third-party developer software anyway, using homebrew, a VM or other solutions (not sure about Docker - don't use it).

I can see where it would be an issue with developing for MacOS or iOS, but third-party web development doesn't really need anything that is installed on the system, and in my experience, it's actually better not to base it on the MacOS default packages.


Homebrew installs its own binaries in a way that they won't interfere with the macOS-supplied versions. It doesn't use a container or anything like that. Certainly not a full-on VM.

Maybe I'm old-fashioned (feels weird to say that as a 25-year-old, but here we are), but I quite like not having everything be in a container all the time.


> Homebrew installs its own binaries in a way that they won't interfere with the macOS-supplied versions

Correction: Homebrew usually installs its binaries in a way that doesn't interfere with the ones provided by the system. It will occasionally override or shadow them, and things will break.


> It will occasionally override or shadow them

That's extremely rare, these days. When a conflict is possible, they just drop the new version in the cellar and tell you what to do if you really want to risk shooting yourself in the foot.
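Whether an installed copy shadows the system one comes down purely to PATH order, which is easy to demonstrate with two stand-in binaries (hypothetical directory names, sketching the general mechanism rather than Homebrew's actual layout):

```shell
#!/bin/sh
# Two same-named "ruby" binaries; PATH order decides which wins.
set -e

mkdir -p sysbin brewbin
printf '#!/bin/sh\necho system ruby\n' > sysbin/ruby
printf '#!/bin/sh\necho brew ruby\n'   > brewbin/ruby
chmod +x sysbin/ruby brewbin/ruby

# `command -v` shows which copy a given PATH would resolve.
PATH="$PWD/sysbin:$PWD/brewbin" command -v ruby   # sysbin copy wins

# Reverse the order and the "brew" copy shadows the "system" one.
PATH="$PWD/brewbin:$PWD/sysbin" ruby              # prints: brew ruby
```

This is why Homebrew can keep a conflicting formula in the cellar without breaking anything: as long as it isn't linked into a directory that precedes the system paths, the system binary still wins.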


> A different solution, and possibly better, would be to run Unix/Linux in a VM rather than worrying about it being native in the system.

Just like WSL 2.0 (https://devblogs.microsoft.com/commandline/announcing-wsl-2/)?



