• 0 Posts
  • 4 Comments
Joined 1 year ago
Cake day: June 15th, 2023

  • I sort of agree with you, but not in the way I think you meant it.

    Vista’s problem was that its hardware requirements were too high for its time. Operating systems have very long development lifecycles, and early on Microsoft made a forward-looking estimate of where the PC market would be by the time Vista released - and they overshot. When it was almost ready to ship, Microsoft put out the initial minimum and recommended specs, and PC sellers (Dell, HP, Gateway) lobbied them to lower the numbers; the cost of a PC that met the recommended specs was just too high for the existing PC market, and it would have killed their sales numbers if they only sold PCs that met those figures. Microsoft complied and lowered the specs, but didn’t actually change the operating system in any meaningful way - they just changed a few numbers on a piece of paper and added some configurations that let you disable some of the more hardware-intensive bits. The result was that most Vista users were running it on hardware that wasn’t actually able to run it properly, which led to horrible user experiences. Anyone who bought a high-end PC or built one themselves and ran Vista on it, however, seemed quite happy with the operating system.


  • Unity did a bad thing, but the stock sale here is a complete non-event.

    According to Guru Focus, Unity CEO John Riccitiello, one of the highest-paid bosses in gaming, sold 2,000 Unity shares on September 6, a week prior to its September 12 announcement. Guru Focus notes that this follows a trend, reporting that Riccitiello has sold a total of 50,610 shares this year, and purchased none.

    He receives and sells stock constantly, as do most execs of publicly traded companies. The majority of their compensation is stock, which incentivizes them to maximize the stock price, since a higher price means more money RIGHT NOW for them. Look up any publicly traded company and peek at its insider trading info. Microsoft as a random reference, and here’s Unity so you can see everyone else and the long-term trends.

    The piece cites Guru Focus as its source for this info as if they have some keen inside information or something, but it’s literally public data that anyone with an internet connection can look up, since these sorts of notices are required for publicly traded companies. Riccitiello only sold about $83k worth of stock before the announcement, for a total of about $1.1M worth of stock this year, vs. about $33M last year and close to $100M in 2021. The idea that he dumped $83k worth of stock to beat the bad news Unity was dropping is just a hilariously bad take.



  • AI resume screeners are very much at risk of bias. There have been stories about exactly this in years past. The ML models need to be trained, so they get fed resumes of candidates that were hired and not hired so the model can learn to differentiate the two and make decisions on new resumes in the future. That training, though, takes any bias that went into previous decisions and brings it forward.

    From the Amazon story I linked above, the model was prioritizing white men over women and people of color. When you think back to how these models were trained, though, that’s exactly what you’d expect to happen. No one was intentionally introducing bias to the AI process, but software teams have historically been very male and white, and when referrals and references come into play, those demographics were further emphasized. And then let’s not pretend that none of those recruiters or hiring managers were bringing their own bias to the table.

    If you feed that into your model as its training data, of course the model is going to continue to favor white men - not because it’s actually looking for men, but because resumes like the ones men typically submit are the kinds that got hired. Then they found that applicants whose resumes mentioned a professional women’s organization, or historically black or women-only colleges, were typically not hired. The model isn’t “thinking” about why that is - it just learns that when certain traits exist, the resume is ranked lower, so it replicates that.
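    The mechanism is easy to demonstrate with a toy sketch (this is my own illustration, not Amazon’s actual system): train a simple logistic regression on synthetic “historical” hiring decisions where strong candidates were rejected whenever a resume mentioned a women’s organization, and the model dutifully learns a negative weight for that feature.

    ```python
    # Toy sketch of bias propagation: the model learns to penalize a proxy
    # feature because the biased historical labels did. Synthetic data only.
    import math
    import random

    random.seed(0)

    # Features: [years_of_experience scaled 0-1, mentions_womens_org 0/1]
    # Biased historical labels: experienced candidates were hired only
    # when the resume did NOT mention a women's organization.
    data = []
    for _ in range(400):
        exp = random.random()
        org = random.randint(0, 1)
        hired = 1 if (exp > 0.5 and org == 0) else 0
        data.append(([exp, org], hired))

    # Minimal logistic regression trained by batch gradient descent.
    w = [0.0, 0.0]
    b = 0.0
    lr = 0.5
    for _ in range(2000):
        gw0 = gw1 = gb = 0.0
        for x, y in data:
            p = 1 / (1 + math.exp(-(w[0] * x[0] + w[1] * x[1] + b)))
            err = p - y
            gw0 += err * x[0]
            gw1 += err * x[1]
            gb += err
        n = len(data)
        w[0] -= lr * gw0 / n
        w[1] -= lr * gw1 / n
        b -= lr * gb / n

    # The weight on the org-mention feature comes out strongly negative:
    # the model replicated the bias without anyone telling it to.
    print(w)
    ```

    No one coded “penalize women’s organizations” anywhere - the negative weight falls straight out of the training data.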

    Building a truly unbiased AI system is actually incredibly difficult, not least because the data scientists working on these systems are themselves predominantly male and white. We’ve also seen this issue in the past with other AI systems, including facial recognition systems, where systems built by teams of white men can’t seem to make reliable determinations when looking at a picture of a black woman (with accuracy rates 20-30% lower for black women compared to white men).