You have imposter syndrome so bad you don't even think you're a real imposter!
Wait until this guy finds out that Elon doesn't actually build the cars
Three things off the top of my head:
5 years ago everything was moving to TypeScript. Now everything has moved. Developers are still catching up, but it will be one-way traffic from here.
I'm guessing your manager thinks TypeScript is like CoffeeScript. It is not like CoffeeScript.
Also, TypeScript is only the beginning. In the halls of the tech giants most devs view TypeScript as a sticking plaster until things can be moved to webassembly. It will be a long time until that makes any dent in JS, but it will also be one-way traffic when it does.
I mean, let's be real, Rust is really called out because it causes high drama between C devs and Rust advocates, which drives engagement.
It's probably all kicking off in about 10 different comment sections right now
“As we’ve said, we’re responsibly investing in our company’s biggest priorities and the significant opportunities ahead,” said Google spokesperson Alex García-Kummert. “To best position us for these opportunities, throughout the second half of 2023 and into 2024, a number of our teams made changes to become more efficient and work better, remove layers, and align their resources to their biggest product priorities. Through this, we’re simplifying our structures to give employees more opportunity to work on our most innovative and important advances and our biggest company priorities, while reducing bureaucracy and layers”
There was this incredible management consultant in France in the 18th century. Name eludes me, but if he was still around Google could hire him and start finding some far more convincing efficiencies.
The guy was especially good at aligning resources to remove layers
eh, more like self-important plumbers
Counter-point: Atom is terrible. Its Electron competitors are terrible. Big IDEs are terrible. Simple text editors are terrible.
If you are under 50 and chose to learn vim or emacs, there is a 100% chance that you were also forced to learn Latin at school and honestly it's not your fault that you turned out this way.
These are all the options. Sometimes all the options are terrible.
How did they ask all these random people and not bother to ask a single software engineer?
"Hi is this excuse real, or is it just a sign of an inappropriate relationship between the local council and a dodgy software company that pays more dividends than developers? Oh it's the latter? Okay, thanks."
I use a Tokonami KX450, which is not the newest but it's the most widely available military-grade model that the average silicon shop is able to customise.
With that in mind you'll want a uranium microreactor to really get that turbo button cranking out the keycodes (the French stuff is cheapest but Ukrainian kit is worth the extra), as well as a mercury cooling solution and ideally a set of maglev keys for all the most common letters (NOT backspace; frankly you should remove that key entirely to avoid habits that damage your WPM).
Assuming you've got a solid pair of high-torque power gloves that should get you up to at least 20000 WPM, which admittedly won't cut it if you're trying to keep all the NPM dependencies up to date in a modern bank's transaction processing software, but it's probably enough if you're just doing a bit of data analysis in Python.
Even following the guidelines, modern C++ is just a huge pile of half-finished ideas. It goes pretty well for the first few hundred lines of code, and then you hit a very basic problem where the solution is "yes this will work great in C++26, if the proposal doesn't get delayed again".
I'm still confused by this not being cross-platform. It's made in Rust; basically every graphics library is cross-platform out of the box, and so is all the file IO stuff. There will be some specialist OS API stuff in places but surely it can't be much.
For once this comment isn't even snark. I acknowledge my ignorance and wonder if someone could explain why the cost is bigger than I think?
Perhaps it's setting up CI and packaging for other platforms? Maybe they want human QA on every release? Maybe the APIs for slick OS integration are more complicated than I realise? (e.g. putting UI in the taskbar)
They are not stupid at all. Their interests are in conflict with the interests of tech workers and they are winning effortlessly, over and over again.
The big tech companies are all owned by the same people. If these layoffs cause google to lose market share to another company, it's fine because they own that company too.
What matters is coordinating regular layoffs across the whole industry to reduce labour costs. It's the same principle as a strike: if the whole industry does layoffs, workers gradually have to accept lower salaries. In other words, the employers are unionised and the employees are not.
This process will probably continue for the next 20 years, until tech workers have low salaries and no job security. It has happened to countless industries before, and I doubt we are special.
I'm sure the next big industries will be technology-focused, but that's not the same as "tech". They won't involve people being paid $200k to write websites in Ruby.
SIGH. Capitalism is a fringe conspiracy theory. Next you'll be claiming that billionaires earn their money through "capital gains" instead of salary, or that every corporation answers to a shadowy cabal of "shareholders" who only care about profit.
Well you won't fool me. Unlike you, I have educated myself by reading newspapers.
Literally any logic programming language
The specifics of C's design could barely be less important. In the 70s it was one of countless ALGOL derivatives churned out on-demand to support R&D projects like Unix.
Unix succeeded, but it could have been written in any of these languages. The C design process was governed by the difficulty of compiler implementation; everyone was copying ALGOL 68 but some of the features took too long to implement. If Dennis Ritchie had an extra free weekend in 1972, C might have a module system. But he didn't, so it doesn't.
If I went back to the early days of C, it would be because John Connor sent me there in a time machine to destroy the first compiler before it could become self-hosting
Rewriting bits of the kernel makes sense. I can't imagine them porting much C# to Rust though, beyond very small, self-contained services.
Everyone likes a dramatic headline, but in my estimation there is 0% chance of Microsoft pushing widespread Rust adoption over C#.
In the long term I'd guess they are more likely to continue extending C# with features that make it possible to optimise hot loops. They already added NativeAOT and ref structs, and they have done a lot of research into memory regions and capabilities (an alternative to Rust's affine types).
Eventually it may be possible to opt into a clunky language subset that gives Rust-like performance without giving up memory safety.
They are also quite likely to use OS-level intervention to safely sandbox C++ code inside a .NET process without giving up performance. They've done a lot of research on this, and now they can steal notes from webassembly too.
I swear, Zoomers are like the Steve Buscemi "fellow kids" meme, but somehow everyone in the scene is young
Anyway, nice compiler. Might feel basic to you, but writing a back end for a low-level IR format is not that much harder.
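To put some weight behind that claim, here's a toy sketch in Rust (the IR is entirely made up, not any real format) of how small a naive back end can be: three-address instructions lowered 1:1 to x86-64-flavoured text, with every virtual register spilled to a stack slot.

```rust
// Hypothetical toy IR, lowered naively to Intel-syntax x86-64 text.
enum Inst {
    Const { dst: u32, value: i64 },       // dst = value
    Add { dst: u32, lhs: u32, rhs: u32 }, // dst = lhs + rhs
    Ret { src: u32 },                     // return src
}

// Every virtual register lives in a stack slot; no register
// allocation at all. Real back ends spend most of their effort here.
fn slot(reg: u32) -> String {
    format!("qword ptr [rbp - {}]", 8 * (reg + 1))
}

fn lower(insts: &[Inst]) -> String {
    let mut asm = String::from("push rbp\nmov rbp, rsp\nsub rsp, 256\n");
    for inst in insts {
        match inst {
            Inst::Const { dst, value } => {
                asm += &format!("mov {}, {}\n", slot(*dst), value);
            }
            Inst::Add { dst, lhs, rhs } => {
                asm += &format!("mov rax, {}\n", slot(*lhs));
                asm += &format!("add rax, {}\n", slot(*rhs));
                asm += &format!("mov {}, rax\n", slot(*dst));
            }
            Inst::Ret { src } => {
                asm += &format!("mov rax, {}\n", slot(*src));
                asm += "leave\nret\n";
            }
        }
    }
    asm
}

fn main() {
    // return 2 + 3
    let program = [
        Inst::Const { dst: 0, value: 2 },
        Inst::Const { dst: 1, value: 3 },
        Inst::Add { dst: 2, lhs: 0, rhs: 1 },
        Inst::Ret { src: 2 },
    ];
    print!("{}", lower(&program));
}
```

A production back end spends nearly all its effort on what this skips: register allocation, instruction selection, and scheduling.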
I don't need this level of introspection on a Tuesday morning
Here's one I learned from a past manager:
Stress that everyone needs to pitch in and make themselves useful at all times, but do not share any information at all
Make sure the work is not broken down into clear tasks. Make sure nobody else has access to the stakeholders. Make people ask separately for every single account or access credential they need, and respond with incredulity that they don't already have it.
Give the impression that there are no processes. When someone submits work, criticise them for not following the process.
Each day, schedule meetings so you are impossible to contact until the early afternoon. That way you can interrupt any request for information by asking the person what work they did in the morning. The goal is to close the loop by making people scared to talk to you, so they blame themselves for not knowing anything.
He won computer-science boffinry's highest possible gong
why must british journalists write like this? even in an obituary!
Why would you discourage interesting, original journalism over such an obtuse nitpick?
They are clearly criticising the same capitalist structures that you are. They single out the tech industry because the article is about the misuse of tech, not because they think rank and file tech workers are deviants.
Frankly it comes off as fragile and dismissive, and if that's what we're doing we could have just stayed on reddit.
Prolog, Mercury, Datalog. Very intrigued by Verse now that I know it has some logic programming features.
Mercury is, roughly, a fusion of Haskell and Prolog. Bizarre and fascinating.
Prolog and Datalog are great but not aimed at general purpose programming.
Really I just want to see more people trying to adapt ideas from logic programming for general purpose use. Logic programming feels truly magic at times, in a way that other paradigms do not (to me at least).
You could write a compiler for a low-level language in anything. Honestly makes little sense that most people do it in C++ when they're only going to replace it anyway.
you have been looking at my private github repos and are clearly a witch
I don't think TIOBE rankings mean much, but C# is genuinely doing quite well.
In the games industry it has become the only language that is commonly used both for entry-level gameplay scripting and for implementing game engines.
It still has a lot of overhead compared to languages like C++ and Rust, but has proven to be a workable alternative for games that aren't straining against the limits of modern hardware.
The GitHub blurb says the language is comparable to general purpose languages like Python and Haskell.
Perhaps unintentionally, this seems to imply that the language can speed up literally any algorithm linearly with core count, which is impossible.
If it can automatically accelerate a program that has parallel data dependencies, that would also be a huge claim, but one that is at least theoretically possible.
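For a rough ceiling (my own back-of-envelope, not from the blurb): Amdahl's law says that if a fraction p of a program parallelises, the speedup on n cores is S(n) = 1 / ((1 - p) + p/n), which tends to 1/(1 - p) as n grows. Even 5% inherently serial work caps you at 20x, no matter how many cores you throw at it.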
If I understood correctly, the closest thing I know of to what you are describing is probably Terra:
It is an academic project with various papers presenting case studies that do things like change the whole programming paradigm of the language, or the execution model, or the syntax.
The wider paradigm is called multi-stage programming. The other obvious languages to mention are the lisp family, and more recent spin-offs like Julia.
The real question is how we got any portable solution to this problem in the first place.
To me these problems are fundamentally economic and political, not technical. For example, the unique circumstances that led to a portable web standard involved multiple major interventions against Microsoft by antitrust regulators (in 2001, 2006, 2009, 2013, etc). The other tech giants were happy to go along with this as a way to break Microsoft's monopoly. Very soon after, Google and Apple put the walls straight back up with mobile apps.
If you go back before HTML, OS research was progressing swiftly towards portable, high-level networked GUI technology via stuff like smalltalk. Unfortunately all of the money was mysteriously pulled from those research groups after Apple and Microsoft stole all the smalltalk research and turned it into a crude walled garden of GUI apps, then started printing money faster than the US Mint.
Whenever you see progress towards portable solutions, such as Xamarin and open source C#, React Native, or even Flutter, it is usually being funded by a company that lost a platform war and is now scrambling to build some awkward metaplatform on top of everyone else's stuff. It never really works.
One exception is webassembly, which was basically forced into existence against everyone's will by some ingenious troublemakers at Mozilla. That's a whole other story though!
Most common reason for being bad at programming is finding it boring and thinking the tools are needlessly obtuse.
If anything this is a sign of great intellect.
But why? I saw their section about rasterising everything on the GPU, and again I find it hard to understand why they need anything more than OpenGL 3.
Their UI looks like it would be an LRU glyph cache and a sprite batcher, and then you'd have 1000fps.
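Concretely, the core of that design might look something like this sketch (all types and names here are hypothetical, and the rasterise/upload step is stubbed out):

```rust
use std::collections::HashMap;

// Sketch of the "LRU glyph cache + sprite batcher" idea: rasterise
// each (font, glyph, size) once into an atlas texture, then draw
// text as batched quads referencing atlas regions.

#[derive(Clone, Copy, PartialEq, Eq, Hash)]
struct GlyphKey {
    font_id: u32,
    glyph_id: u32,
    px_size: u32,
}

#[derive(Clone, Copy)]
struct AtlasRegion {
    x: u32,
    y: u32,
    w: u32,
    h: u32,
}

struct GlyphCache {
    regions: HashMap<GlyphKey, AtlasRegion>,
    lru: Vec<GlyphKey>, // front = least recently used
    capacity: usize,
}

impl GlyphCache {
    fn get_or_rasterise(&mut self, key: GlyphKey) -> AtlasRegion {
        if let Some(&region) = self.regions.get(&key) {
            // Cache hit: mark as most recently used.
            self.lru.retain(|k| *k != key);
            self.lru.push(key);
            return region;
        }
        if self.regions.len() >= self.capacity {
            // Evict the least recently used glyph to free atlas space.
            let evicted = self.lru.remove(0);
            self.regions.remove(&evicted);
        }
        // Placeholder: a real implementation would rasterise the glyph
        // into a free atlas rect and upload it to the GPU texture here.
        let region = AtlasRegion { x: 0, y: 0, w: 0, h: 0 };
        self.regions.insert(key, region);
        self.lru.push(key);
        region
    }
}

fn main() {
    let mut cache = GlyphCache {
        regions: HashMap::new(),
        lru: Vec::new(),
        capacity: 1024,
    };
    let region = cache.get_or_rasterise(GlyphKey { font_id: 0, glyph_id: 65, px_size: 16 });
    // The sprite batcher would emit one textured quad per glyph, all
    // sharing the atlas texture, so a whole frame of text collapses
    // into a single draw call.
    let _ = region;
}
```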
They are both doomed because neither is transformative enough to justify adoption. They are going to need to solve much harder problems to do that.
Take Rust as an example. It solved a problem that most people weren't even paying attention to, because the accepted wisdom said it was impossible.
I learned through three things:
The first two things helped me understand how common code constructs are translated to assembly, so I can do a rough projection in my head when skimming a C function. Nowadays you can get quite far just by playing around on godbolt.
The third thing helps surface the less visible aspects of CPUs. After learning how a few low-level optimisations work, all the principles and explanations start to repeat, and 90% of them apply to every modern architecture. You can set out with specific high-level questions, and very quickly you'll find lots of insightful articles and comments explaining things like CPU caching, prefetching, branch prediction, pipelining, etc.
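Branch prediction, for example, is easy to observe for yourself. A minimal Rust sketch of the classic experiment, assuming only std (note the optimiser may vectorise the branch away entirely in release builds, which is itself instructive):

```rust
use std::time::Instant;

// The same loop over the same data runs much faster once the data is
// sorted, because the branch becomes predictable.
fn sum_over_threshold(data: &[u8]) -> u64 {
    let mut sum = 0u64;
    for &x in data {
        // Taken ~50% of the time at random for shuffled data, but
        // perfectly predictable once the data is sorted.
        if x > 127 {
            sum += x as u64;
        }
    }
    sum
}

fn main() {
    // Deterministic xorshift pseudo-random bytes, so no dependencies.
    let mut seed = 0x2545F4914F6CDD1Du64;
    let mut data: Vec<u8> = (0..10_000_000)
        .map(|_| {
            seed ^= seed << 13;
            seed ^= seed >> 7;
            seed ^= seed << 17;
            (seed >> 24) as u8
        })
        .collect();

    let t = Instant::now();
    let a = sum_over_threshold(&data);
    println!("shuffled: {:?} (sum={a})", t.elapsed());

    data.sort_unstable();

    let t = Instant::now();
    let b = sum_over_threshold(&data);
    println!("sorted:   {:?} (sum={b})", t.elapsed());
}
```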
I have no book recommendations for you. I've found all the best information is freely online in blogs and comment sections, and that the best way to direct my learning is to have a project (or get employed to do low-level stuff). Might be different for you though!
Lol okay. Here are some concrete examples I don't have:
- std::expected: eventually approved, similar to Rust's Result type, but with no equivalent to the '?' operator to make the code readable
- std::expected has even stabilised, but will probably not be available for 10 years
- An auto variable in a template, instead of giving a helpful type error, implicitly coerced a new copy of my vast memory buffer into existence

You can argue about productivity and “progress” all you like, but none of that will raise you back into my good opinion.
Why would you quote this and then immediately argue about productivity and progress?
Thanks for clarifying. Do you know what the bug is? I gather it's something to do with enums and boxing to enable a "safe" transmute, but couldn't make sense of the code.
Scary compiler edge cases with value-type enums aren't surprising though. Trying to implement that feature with memory safety and high performance sounds like a nightmare.
That sounds plausible. I'm sure having the lowest possible latency was their goal. There are multiple popular Rust libraries aiming to provide zero-cost abstractions over a common subset of the Metal, Vulkan and DX12 APIs, but I've never actually used one.
Mojo's starting point is absurdly complex. Seems very obviously doomed to me.
Julia is a very clever design, but it still never felt that pleasant to use. I think it was held back by using LLVM as a JIT, and by the single-minded focus on data science. Programming languages need to be more opportunistic than that to succeed, imo.
Also somewhere in the middle:
"There's been no report of this for a while so we're marking it resolved."