Why do devs use Mac?

Hi, yesterday I went to an internship interview at a company that develops 'all kinds of things' (websites, mobile apps, software), and I noticed that all the computers, as far as I could see, were Macs. I'm not hating, I'm pro-choice, but... why?
IMO they're expensive, and their hardware is always a generation or more behind.

Please try to be objective, let's unravel this mystery :thinking:

Probably something about software optimization.

Better platform for development, as far as I can tell. You can have all three worlds of Mac, Linux, and Windows on one machine.

2 Likes

Small fish here. From what I have observed, a lot of devs use a lot of different tools. As soon as a dev/dev shop claims one solution to the wider problem of development, they don't know shit.

1 Like

I am sure there have been a few threads on this in the past.

Basically... it's fashion: a reluctance to use Windows in the past combined with an inability to get on with Linux.

Some of the devs I know are now Win 10 converts; it now has a proper Bash (via the Windows Subsystem for Linux), and Hyper-V works well.

1 Like

It's Unix. It's candy-coated Unix, but it still has access to a real shell with "real" tools.

95% of the perks of Linux, without any of the hassle (driver issues, etc.).

My 2 cents.

5 Likes

So it's an "it just works" thing?

The only reason to use Windows is if you have to do .NET development, and even then it's a pretty terrible environment compared to every other option.

That said, a shitty i5 laptop running Linux will give you a comparable experience for most things, but Apple is generally considered easier.

Computerphile asks all their hosts: Mac or PC?

Mostly Mac and Windows, with only a few on Linux... and a Chromebook.

1 Like

yes

"first I was linux and then too much hassle installing sound drivers and updating operating systems(...)"
Sounds like "it just works" to me. (Nothing wrong with that.)

Kind of. It's a bit more complex than that.

The truth is that you could use a bunch of Dell workstations running Windows, and they would work just fine too.

As long as the company has a decent IT team, they can really make anything work.

When a company wants to buy computers in bulk, it can generally open a business account with the manufacturer. That gets you faster RMAs and cheaper prices on the machines.

The three main business partners are Dell, HP, and Apple. HP computers are kind of shit; they break pretty regularly.

So that really leaves Dell and Apple. With Dell, you are realistically going to get a tower and a monitor. They do have all-in-ones, but the ones I have seen are kind of shitty, so the Dells will take up more space in the office. The other issue is that the larger 27-inch high-DPI monitors from Dell are kind of pricey: I have not seen a high-DPI Dell screen for less than 400 bucks, and a good-quality tower PC runs about 600-800 bucks. So you are already looking at 1K minimum for a similar experience to an iMac.

iMacs at the 27-inch size also now come with 5K screens. The cheapest one starts at 1800 bucks, and I believe they come down in price if you buy in bulk. Higher DPI means sharper text, which means less eye strain and fatigue; those are huge factors for a company built on the long hours of persnickety developers. But obviously that is 800 bucks more than a Dell system that would largely do the same job.
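
To make the math explicit, here's a quick sketch of the comparison using my ballpark figures from above (remembered street prices, not official quotes):

```python
# Back-of-envelope comparison; all figures are rough estimates from this post.
dell_monitor = 400          # cheapest 27" high-DPI Dell monitor I've seen
dell_tower_low = 600        # decent-quality tower, low estimate
dell_tower_high = 800       # decent-quality tower, high estimate
imac_5k = 1800              # cheapest 27" 5K iMac

dell_low = dell_monitor + dell_tower_low    # 1000
dell_high = dell_monitor + dell_tower_high  # 1200

print(f"Dell setup:    ${dell_low}-{dell_high}")
print(f"iMac 5K:       ${imac_5k}")
print(f"Apple premium: ${imac_5k - dell_high}-{imac_5k - dell_low}")
```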

The other thing to keep in mind about Apple computers is that they can run Windows, but Windows computers can't run OS X. So if you are developing software for both operating systems, an Apple computer is almost mandatory for testing purposes.

So this can actually become a VERY complex question: is it worth it to spend the extra money or not?

1 Like

If you are going to write iOS apps, you have to be on a Mac.

2 Likes

I can tell you as a sysadmin: if I can't have a Linux laptop, I wish companies would at least give me a Mac to work on. I cannot count how many times this year alone Windows crapped out on me when I needed to get something time-sensitive done. It's like Windows knows: "Oh, would this be the absolute worst time to shit the bed? Well then..."

6 Likes

In my personal experience, application quality runs, from low to high: Windows, Linux, Mac. On a Mac, the apps are just much more polished and better supported. The downside, of course, is that most of them are not free, though companies don't really care about that. The only software I have found more polished on Windows than on Linux/Mac is Krita and DaVinci Resolve.

Also, the Mac terminal is an almost indistinguishable Bash experience, so you have the option of using the CLI if you want to, while on Linux it's almost mandatory.

Generally you don't need the latest hardware to get work done. As a former Mac Pro 4,1 owner, running on 8-year-old hardware didn't really slow me down; instead, access to the Mac app ecosystem made things easier. I ultimately switched over to Linux because I found myself using the CLI so frequently that I outgrew GUIs.

I am a web dev, using a Mac.

A couple of years ago, around the Snow Leopard era, Macs were simply the best hardware+software combination you could buy for non-enterprise web development (PHP, Ruby on Rails, and Django at the time). The hardware quality was far better than anything except the highest-end business laptops, while looking great at the same time. The unibody MacBook Pro was one of the prettiest laptops (in both software and hardware) when it came out, while being relatively light with great battery life. The OS was simple to use, worked well out of the box, and at the time natively provided almost everything Linux could offer a web dev. Even if it was more expensive than a PC, not having to debug why Skype video calls don't work was worth the money.

Around 2010 I saw an explosion in Mac share among my peers. That led to a lot of tools and know-how being built around the Mac, creating a network effect.

Nowadays, with Apple losing focus on the Mac and PC manufacturers having caught up on hardware quality, I see Macs in a slow decline. More and more people I know are considering going back to Linux. Now that everything is a web application and Linux has matured a lot in the last couple of years, I think the switch back won't be that hard.

Generally it's down to the broad install base you can develop for. This has changed in recent years, as developing for Mac and iOS has become feasible on Windows-based systems thanks to increased leniency around VMs and other tools. On a Mac, however, you can develop for Linux, Windows, Mac, Android, and iOS, all from the same system. A few years ago, Windows was limited to Windows, Linux, limited Mac development, and Android, while Linux was limited to Linux, some Windows, and some Mac without the use of VMs or Wine.

Now obviously there are limitations. You're not going to be working on the latest and greatest AAA game on a Mac, nor are you really going to be doing a lot of testing of mainframe systems; this is where Windows and Linux hold strength. But for web apps, mobile apps and mobile games, or native Mac software, your best option is a Mac. That's not to say Windows or Linux is wrong; these days they're great options with their own benefits, but a few years ago this was the difficulty companies faced.

There is so much more trash software on Windows.

It's the app ecosystem on OS X; maybe it's even the design standards that come with OS X development that make the apps seem more mature. As a developer, I've tried going to Windows, and I like Hyper-V, but I wouldn't trade my Mac apps or the Unix ecosystem for Windows unless I played games at work. I would run some Linux and beg my designers to do half my job.

Hyper-V seems like a great app, but it too is sort of painful and cumbersome, and it's only built into the Professional edition, whereas I spend $100 a year per machine on a Parallels developer subscription.

It's a mainstream version of Unix. Essentially the same benefits as Linux, but mainstream, and it probably allows for admin control or something like that.

1 Like
