Yeah, partly politics around linking to an external tool. The syntax highlighting, code completion, live pre-compiling of code, etc. in things like Xcode are simply not possible using GCC as a back end, since GCC was deliberately never architected to be used as a library (partly for licensing reasons).
I believe Apple also had issues getting Objective-C support updated in GCC (adding Swift was presumably a total no-go, and even if it wasn't, the Objective-C situation would have left a bad taste in their mouth, so to speak), so they took their bat and ball and put money into Clang.
But no, Apple are horrible for open source.
Also, for years, back in the bad old days (I'm talking kernel 1.x, 2.0.x, 2.2.x), Linux had to be compiled with GCC 2.7.2 (IIRC) ONLY, because the kernel depended on the exact behaviour of that particular version — its bugs happened to be the right set of bugs for Linux to work against.
Even back when GCC was dominant, it was pretty buggy.
Not quite sure what the issue is with the culture though? If anything, we need more "free" software (free as in speech, not beer) at this point in time.
Granted, some people there take it to extremes, like one guy I argued with who insisted on running purely FOSS software — however hard, if not impossible, that is in practice, as much of a good idea as it might sound. Shit, I ain't running "FSF Approved" software, considering I need some blobs to get things to work.
Still, I never realized how much Clang has taken off recently. Now that's something I gotta look up.
There are tons of cultural issues in FSF-land, but the biggest one is an allergy to collaborative standards.
They dogfood extensively, which wouldn't be so bad in and of itself, but they don't inform any of their design decisions with prior art when they do it. This style of project administration tends to lead to cargo-cult, insular dev cycles, and cascading problems in the wider ecosystem, because no one wants to learn from past bodies of knowledge or acknowledge expertise or competence outside of their particular licensing scheme.
Take Canonical's projects, for example:
Did any of these ever become standards? No? I wonder why.
I understand Apple wanting their own proprietary graphics API with Metal. It's just a good way to consolidate iOS and OSX software.
iOS has leaps and bounds more software available for it than OSX, and it is clear that Apple wants to pull that software over to their desktop environment with little effort. But at the same time, it seems like something that Apple should have put in place a long time ago.
Treating Vulkan as a second-tier API is just Apple's way of pushing developers to adopt Metal instead. Yeah, there is MoltenVK, but it seems like Apple decided to throw up a few additional hoops just to discourage it in favour of Metal.
Also, Apple deprecating OpenGL isn't anything new; they have effectively been doing that for years through lack of updates. OpenGL has always been trash on Apple's desktop machines relative to Windows and Linux. Shame about OpenCL, though.
Metal was released in 2014, and had been in development and in internal use at Apple well before that. The development tools, etc. became available to the public in 2014.
Vulkan was released two years later. That ship sailed 4+ years ago. Maybe if Khronos (the OpenGL/Vulkan standards body) had had something useful to offer back in, say, 2011-2012, when Apple would have been looking for a new API, things would be different. But they didn't, so they aren't. It's as much because it arrived much later as anything else that it is treated as second tier.
So it's no surprise that Vulkan's duplicate functionality will be supported via a shim rather than as part of the core platform. As stated above, DX12, Vulkan, and Metal are all similar; porting from, say, DX12 to Metal will be a lot easier than porting from DX11 to OpenGL would have been.
Basically, the Khronos Group was still working on its "OpenGL Next" API, which was lagging in development. Then AMD handed them the source code to its failed Mantle API, which caused them to switch gears and develop Mantle into Vulkan.
Microsoft pretty much based DX12 on Mantle as well; they described it as "Mantle-like" during development. Apple's Metal is also a Mantle-like low-level API, I suppose.
With all the new APIs in development at that point in time, I guess it did make sense for Apple to just focus on their own graphics API. My point is only that this is something they could have tried developing a decade earlier.
But I guess it makes sense now, if they want to unify their platforms. It has worked well for Microsoft across the board with their products.
Oh, I agree with that. The APIs across the board are somewhat similar, each with its own dialect.
In the long run, though, it still kind of sucks for legacy OpenGL applications. Then again, maybe there are OpenGL-to-Metal wrappers out there that get better performance than the OpenGL drivers found in current OSX? Maybe Apple can supply a good compatibility layer for legacy OpenGL stuff?
I would wager that legacy OpenGL translated to Metal may actually run faster than the old OpenGL implementation…
I wouldn't say Mantle "failed". It was an experimental API as I understand it, and a few games did ship full support for it, including Battlefield 4 and Dragon Age: Inquisition, to name two actually in my collection.
So it probably has a better track record than "proper" DX12 support at this point… I have yet to see a DX12 title that is actually built for DX12, rather than DX12 in name only while running pretty much the same way it does in DX11.
I'd say Vulkan is essentially "Mantle 2.0" in everything but name. Saying Mantle failed is like saying DX10 failed because everyone uses DX11 now.
I think of Mantle as very successful. It's no secret that AMD's drivers lag years behind NVIDIA's in terms of optimizations and API support. AMD's solution was to push for lower-level graphics APIs, which have simpler drivers and put much of the optimization burden on the game developers themselves.
It is thanks to Mantle that we have the new low-level APIs. The way I see it, AMD has reached its goal.