For anyone looking to learn git, the official book and site are thorough and exceptional. You can even download the eBook for free. While there’s no harm in using other sources to learn git, don’t use them as an alternative to the canonical source.
I wonder if these trillion dollar companies offer support contracts for astroturfing on social media on their behalf. I can’t think of any other way so many people are supporting their sociopathic attitude.
The devs don’t take issue with the ticket being filed. They’re irritated by one particular reply, which sounds like “My million dollar product depends on this bug fix. Please do that for me”. MS isn’t offering a solution. They’re asking for one.
To be fair, MS offers an amount for the fix. Most companies just bully the devs instead. However, I don’t think it’s quite fair (though legal) to offer one-time payments for a core library that they use.
Those same companies tell you that their products that you paid for don’t belong to you. You are just buying a license to use them. Sadly, this asinine concept is spreading even to hardware markets.
I think it’s fair to ask them to take their own bitter pill. They should also invest without owning.
The hack is still not fully understood and is being analyzed. It doesn’t help that GitHub suspended everything, including the original maintainer’s account (who is believed to be a victim of social engineering).
Anyway, you will eventually see a post-mortem. I’m willing to bet that it’s going to be as phenomenal as the hack itself. The case and its investigation are going to be a classic case study for security researchers and security-minded users. That said, I doubt that the attackers will ever be found. Jia Tan, Jigar Kumar and the others are going to remain ghosts, like Satoshi Nakamoto.
They really ought to have version masking like in Gentoo portage.
Peter Thiel is insolent enough to say out loud what these companies practice - ‘competition is for losers’. These quasi-monopolies aren’t here to provide the best value - quite the opposite. They want to kill all competition by any dirty tactic and then use the diminished choice to wring every penny out of their customers. They want to extract maximum revenue by making sure that their inferior solution is the only option customers have.
This problem isn’t solvable by market regulation alone. The world has enough a*****es around who will climb to the top of successful companies and find ways around the regulations. They’re being as bad as they can, while skirting the limits of what’s illegal. My main gripe is with the engineers, programmers, technicians and all technical creators who enable these scumbags. It’s not hard to see that supporting a proprietary solution amounts to yielding the consumers’ bargaining power to a monopoly. Despite that, they keep making these choices. For example, it’s not uncommon to hear senior engineering managers or technical-lead level employees saying, “I know that Chrome is spyware and I want to quit it. But this <stupid-webservice-at-office> works only on Chrome”. I feel like screaming at them that if they’re too incompetent to demand a change at the level they’re at, they’re in the wrong profession.
If you’re a technical creator, your choices matter. It affects a lot more people than you alone. But more often than not, I see such creators surrendering principles in exchange for convenience. They hold as much responsibility as the market-abusers in making the world the way it is now.
Interesting that they started dictating what you can and can’t do with YOUR program! Consumer rights are a joke to these quasi-monopolies.
CUDA is an API for running high-performance compute code on Nvidia GPUs. CUDA is proprietary, so CUDA programs run only on Nvidia GPUs. Open alternatives like Vulkan compute and OpenCL aren’t as popular as CUDA.
Translation layers are interface software that allow CUDA programs to run on non-Nvidia GPUs. Creating such layers requires a bit of reverse engineering of CUDA programs. Nvidia is now prohibiting this. They want to ensure that all the CUDA programs in the world are limited to Nvidia GPUs alone - classic vendor lock-in through the EULA.
What’s ironic is that rebases aren’t as hard as many consider them to be. Once you’ve done one a couple of times, you do it every day as easily as you commit changes.
I find myself passing copies of values around and things like that, it might be that the compiler just takes care of that,
Rust prefers explicitness over magic. So it does what you tell it and doesn’t just take care of that.
If you’re copying a lot of values around (i.e. cloning, not moving or borrowing), then you’re definitely doing it inefficiently. But you don’t have to worry too much about that. If borrowing proves too difficult somewhere, it may be because those borrows are problematic with respect to memory safety. In such cases, sacrificing performance by cloning may be an acceptable compromise to preserve memory safety. In the end, you land on the right balance of performance (through borrowing) and safety (through cloning). That balance is hard to achieve in C/C++ (which lack safety) or in GC languages (which lack performance).
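To illustrate the tradeoff, here’s a minimal sketch (the function names are made up for the example): the borrowing version costs nothing extra, while the cloning version copies every string but avoids any borrow-checker friction.

```rust
// Borrowing version: no copies, but the borrow must satisfy
// the borrow checker. The caller keeps ownership of `words`.
fn total_len(words: &[String]) -> usize {
    words.iter().map(|w| w.len()).sum()
}

// Cloning version: takes ownership of a copy, sidestepping
// borrow-checker friction at the cost of copying every String.
fn total_len_cloned(words: Vec<String>) -> usize {
    words.into_iter().map(|w| w.len()).sum()
}

fn main() {
    let words = vec!["hello".to_string(), "world".to_string()];
    // Borrowing: `words` remains usable afterwards.
    let n = total_len(&words);
    // Cloning: we hand over a copy and keep the original.
    let m = total_len_cloned(words.clone());
    assert_eq!(n, m);
    println!("{n}"); // prints 10
}
```

Reaching for `.clone()` here is safe and correct, just slower; the borrowing form is what you converge on as the friction fades.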
If that’s the friction you’re facing in Rust, then I would say that you’re already in a good position and you’re just trying too hard.
they don’t feel like your fighting the language
I really understand what you mean wrt Rust. I really do - I was there once. But it’s a phase you grow out of. Not just that - the parts you fight now will eventually become your allies.
and let me feel sort of creative in the way I do things
I had the same experience with C/C++. But as the design grows, you start hitting memory-safety bugs that are difficult to avoid while coding - even after you learn how those bugs arise in the first place. Just a lapse of concentration is enough to introduce such a bug (leaks, use-after-free, deadlocks, races, etc.). I’ve heard that C++ got a bit better after the introduction of smart pointers and other safety features. But it comes nowhere near the peace of mind you get with garbage-collected languages.
That’s where Rust’s borrow checker and other safety measures kick in. The friction disappears once you acquire systems knowledge - the concepts of stack, heap, data segment, aliasing, ownership, mutation, etc. This knowledge is essential for C/C++ too, but the difference is that Rust will actually tell you when you’ve made a mistake. You don’t get that with C/C++. The ultimate result is that when a Rust program compiles successfully, it almost always works as you expect it to (barring logic errors). You spend significantly less time debugging or worrying about your program misbehaving at runtime.
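As a small made-up example of that difference: the following pattern compiles without complaint in C++ (where push_back can reallocate and leave a reference dangling), but rustc rejects the unsafe ordering outright.

```rust
fn main() {
    let mut v = vec![1, 2, 3];
    let first = &v[0]; // immutable borrow of `v`
    // v.push(4);      // rejected by rustc: cannot borrow `v` as mutable
    //                 // while `first` is still in use. The equivalent
    //                 // C++ push_back could reallocate the buffer and
    //                 // leave `first` dangling -- silently.
    println!("first = {first}");
    v.push(4); // fine here: the borrow of `first` has ended
    assert_eq!(v, vec![1, 2, 3, 4]);
}
```

The compiler error arrives at build time, exactly where the C++ version would have produced a latent use-after-free.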
The ‘friction’ in Rust also helps in another way. Sometimes, you genuinely need to find a way out when the compiler complains. That happens when the language is too restrictive and incapable of doing what you need. You use things like unsafe, Rc and RefCell for that. However, most of the time, you can work around the problem that the compiler is indicating. In my experience, such ‘workarounds’ are actually redesigns or refactors that improve the structure of your code. I find myself designing code best when I’m using Rust.
but I disagree wholly that it’s the language’s fault that people can exploit their programs. I’d say it’s experience by the programmer that is at fault, and that’s due to this bootcamp nature of learning programming.
Considering that even the best programmers in the world can’t write correct programs in C/C++, it’s wrong to absolve those languages of the massive number of memory-safety bugs in them. The aforementioned best programmers don’t lack the knowledge needed to write correct programs. But programmers are just humans, and they introduce or miss serious bugs they never intended. Having the computing power to catch such bugs and then not using it is the real mistake here. In fact, I would go one step further and say that it isn’t the language’s fault either. Such computing power didn’t exist when these languages were conceived. Now that it does, the fault lies entirely with the crowd that still insists there’s nothing wrong with these old languages and that the new languages are a fad.
If I were, I wouldn’t be hanging out here, would I? BTW, I use both Vim and Emacs.
More like a personal bias in the form of a distasteful snark that the author thinks is funny. Their demonstrated knowledge about Emacs in the article indicates the worth of such remarks.
They’re a trillion dollar company acting like a petulant child
No. They’re a trillion dollar company acting like the greedy, dirty scum they are.
You think that’s going to convince them? Plenty of people consider Apple the second coming of the messiah. They would cheer if Apple dropped a bucketload of crap on their desk.
That sounds like a … good thing?
PS: I didn’t realize woodworking and gardening are therapeutic. Sounds like a wonderfully productive solution to a pesky problem.
I had the fortune of being the trainer for my company in all things git. I made sure that my colleagues (most of whom were straight out of universities) were introduced to git CLI and git concepts. No git GUIs were introduced. Consequently, the mess they made was easy to rectify. And then I occasionally read about horror stories like these where GUIs are allowed.
GitLab is very complex and a heavy resource hog. You probably don’t need it. Most small to medium enterprises can comfortably host their projects on the lightweight Forgejo or Gitea (speaking from experience). They even have functionality similar to GitHub Actions. If you need anything more complex, you are better off integrating another self-hosted external service into the mix.