Couple of weeks ago, NSI decided to push some of their domains into CLIENT HOLD status, which causes DNS resolution to stop working for the domain.
Took down uh, well, everything: https://status.digitalocean.com/incidents/jm44h02t22ck
[Edit] I’ll have to see if I can find the video.
I can save you the time there, at least: https://youtu.be/hiwaxlttWow
Honestly, I’d contact their support and ask what their processes are and what timelines they give customers for a response/remediation before they take action.
Especially ask how they notify you and how long they allow for a response before escalating, so you can make sure that’s a window you can actually receive, read, and act on a notice within.
It might not be a great policy, but if you at least know what might happen, it gives you the ability to make sure you can do whatever you need to do to keep it from becoming a larger issue.
Everyone loves to hate on Cloudflare, but uh, duh, of course a US company will comply with a request under US law that they have to comply with?
If you don’t want your shit DMCAed, don’t use anything based in the US to provide it.
Go host somewhere that doesn’t have similar laws and won’t comply with foreign requests.
There was a recent video from everyone’s favorite YouTube Canadians that tested how many USB devices you can jam onto a single controller.
The takeaway they had was that modern AMD doesn’t seem to give a shit and will actually let you exceed the spec until it all crashes and dies, and Intel restricts it to where it’s guaranteed to work.
Different design philosophies, but as long as ‘might explode and die for no clear reason at some point once you have enough stuff connected’ is an acceptable outcome, AMD is the way to go.
This new uh, tactic? of going after a registrar instead of a hosting provider with reports is a little concerning.
There’s an awful lot of little registrars that don’t have any real abuse department and nobody is going to do shit other than exactly this: take it down and worry about it next week when they have time.
It really feels like your choice of registrar is becoming as important as, or more important than, your choice of hosting provider, and the little indie guys are probably the wrong choice if you’re running a legitimate business: you’re gonna need one with enough funding and a proper team to vet reports before clobbering your site.
On the OTHER hand, Network Solutions just took down DigitalOcean for no reason, so maybe they all suck?
I’m on year 5 with 6 of them and they’re all fine.
RTSP stream to frigate, and then frigate does the magic AI and recording shit.
They’re also not allowed outside the LAN and don’t seem to care about not being all internet connect-y, though YMMV on newer models.
I can’t think of a single case of being annoyed with them, other than that the mounting pressure is a little wonky and a sufficiently fat corvid can land on one of the ones in the backyard and change its angle, but I’m not sure I’d blame the camera manufacturer for a fat crow.
I mean, it’s not the first time they’ve sued over cheats, and they very much took a sweeping victory last time.
I’d expect the same DMCA circumvention provision along with the always fun “Well, literally everything you did is also a CFAA violation so maybe you want to settle now before we try to get you extradited to the US on federal felony charges” threat would result in pretty much the same outcome here.
I’m a big fan of using model paint, like you’d go buy for, well, models or your Warhammer stuff.
Small bottles, literally any color you could ever possibly want, and it’s easy to work with because it’s designed to be used on tiny little plastic things anyway.
This feels like both a bad argument from the people filing the suit, and a bad call from the judges.
Sure, you don’t have to use iCloud, but Apple has absolutely tied so much functionality - including automated backups - to it that, honestly, it’s de-facto required if you’re going to stay in the ecosystem and expect all the features that are listed on the side of the box to work.
And, of course 5GB is really not sufficient space to even reliably back up a modern iOS device, let alone file syncing, email, photos, messages, etc. at this point.
It feels like the people who brought the suit hadn’t really formulated a good argument (or even had reasonable standing - if you’re using the 5GB tier, it’s hard to argue Apple forced you to do anything), but I don’t think the general gist of ‘Apple is providing 5GB knowing you’re almost certainly going to be forced to upgrade’ is all that wrong.
Looks like others have provided MOST of the answers.
Radarr/sonarr do the heavy lifting making symlinks where symlinks are required, but there’s still the occasional bit of manual downloading.
I also have a script that checks for broken symlinks about once a week and notifies me of them, and I’ll go through and clean them up occasionally, but that’s not super common and only happens if I’m manually removing content I made manual symlinks for, since I’ll just let radarr/sonarr deal with it otherwise.
(The full stack is jellyseerr -> radarr/sonarr -> qbittorrent/sabnzb -> links for jellyfin)
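For the curious, the broken-symlink check is nothing fancy; here’s a minimal sketch of the idea, assuming a single library root (the path is a stand-in, and it just prints instead of notifying):

```python
#!/usr/bin/env python3
"""Weekly broken-symlink sweep (sketch). LIBRARY_ROOT is a made-up path."""
import os
from pathlib import Path

LIBRARY_ROOT = Path("/mnt/media")  # hypothetical library root

def find_broken_symlinks(root: Path) -> list[Path]:
    broken = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            p = Path(dirpath) / name
            # A dangling symlink: is_symlink() is True, but exists()
            # (which follows the link) is False.
            if p.is_symlink() and not p.exists():
                broken.append(p)
    return broken

if __name__ == "__main__":
    for link in find_broken_symlinks(LIBRARY_ROOT):
        # Swap the print for whatever notification you actually use.
        print(f"broken symlink: {link} -> {os.readlink(link)}")
```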
Listen, nothing bad has ever come from someone randomly pushing buttons that are blinking, glowing, or happen to be red.
Never, not once, in the history of mankind has there ever been any regret from doing so, either.
Push ALL the buttons, occasionally repeatedly just in case it missed what you wanted to do the first time.
I just select the files I want from the bigger torrents, and then proceed to not touch it ever again, unless I want to add more stuff to the downloaded files.
I also don’t move things around - I’m on Linux, so all the torrents live in one place with symlinks pointing to where I need/want the data to be, as I figured out yeeeears ago that trying to manage a couple thousand active torrents with the data spread everywhere is a quick trip to migraine town.
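If it helps picture it, here’s a rough sketch of that layout for a manually grabbed file: the torrent payload never moves, the library just gets a symlink. The paths and filename are made up; point them at your own folders:

```python
#!/usr/bin/env python3
"""Sketch: keep torrent data in place, symlink it into the media library."""
from pathlib import Path

TORRENT_DIR = Path("/data/torrents")        # where the client keeps everything
LIBRARY_DIR = Path("/data/library/movies")  # where the media server looks

def link_into_library(payload: Path, library: Path = LIBRARY_DIR) -> Path:
    """Symlink one downloaded file into the library without moving it,
    so the torrent keeps seeding from its original location."""
    library.mkdir(parents=True, exist_ok=True)
    link = library / payload.name
    if not link.exists() and not link.is_symlink():
        link.symlink_to(payload.resolve())
    return link

if __name__ == "__main__":
    # Hypothetical example file.
    link_into_library(TORRENT_DIR / "Some.Movie.2021.1080p/Some.Movie.2021.1080p.mkv")
```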
Quicksync
Yeah, it doesn’t sound like you’re transcoding in a way that’ll show any particular benefit from Quicksync over AMF or anything else. My ‘it’s better’ use case would be something like streaming to a cell phone at 3-5 Mbps, and not something local or just making a file to save on your device.
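For reference, that kind of remote-stream transcode looks roughly like the ffmpeg call below (your media server normally does the equivalent for you). This is just a sketch: it assumes an ffmpeg build with QSV support, and the filenames and bitrates are arbitrary examples.

```python
#!/usr/bin/env python3
"""Sketch of a ~4 Mbps 720p H.264 transcode using Quicksync via ffmpeg's
h264_qsv encoder. Assumes ffmpeg was built with QSV support; paths are made up."""
import subprocess

def transcode_for_remote(src: str, dst: str) -> None:
    subprocess.run(
        [
            "ffmpeg", "-y", "-i", src,
            "-vf", "scale=-2:720",              # scale down to 720p
            "-c:v", "h264_qsv",                 # Quicksync H.264 encode
            "-b:v", "4M", "-maxrate", "5M", "-bufsize", "10M",
            "-c:a", "aac", "-b:a", "128k",
            dst,
        ],
        check=True,
    )

if __name__ == "__main__":
    transcode_for_remote("input.mkv", "phone-friendly.mp4")
```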
DDR4 and no ECC
That’s what my build is: 128GB of Corsair whatever on a 10850k. I’m sure there’s been some silent corruption somewhere in some video file or whatever, but, honestly, I don’t care about the data enough to even bother with RAID, let alone ECC.
I will say, though, if you’re going to delve into something like ZFS, you should probably consider ECC since there are a lot more ‘well shits’ that can happen than what I’m doing (mergerfs + snapraid).
power consumption
A $30 or whatever they are Kill-A-Watt, plus something like s-tui running on the NAS itself to watch what the CPU is doing in terms of power states and usage. I’ve got an 8-drive i9-10850k under 60W at “idle”, which is not super low power, but it’s low enough that the cost of hardware to improve on it even a little bit (and it’d be a very little bit) has an ROI period longer than I’d expect the hardware to last.
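If you want a number without the s-tui UI, the CPU package figure comes from the RAPL counters under /sys/class/powercap; here’s a quick sketch of reading them directly (Linux only, Intel-style path assumed, may need root on newer kernels):

```python
#!/usr/bin/env python3
"""Sketch: read CPU package power from the intel-rapl energy counter."""
import time

# Package 0 on a typical Intel box; may require root to read on newer kernels.
ENERGY_FILE = "/sys/class/powercap/intel-rapl:0/energy_uj"

def read_energy_uj() -> int:
    with open(ENERGY_FILE) as f:
        return int(f.read().strip())

def package_watts(interval_s: float = 1.0) -> float:
    before = read_energy_uj()
    time.sleep(interval_s)
    after = read_energy_uj()
    # Counter is in microjoules and wraps eventually; clamp negative deltas.
    delta_joules = max(after - before, 0) / 1_000_000
    return delta_joules / interval_s

if __name__ == "__main__":
    print(f"CPU package: {package_watts():.1f} W")
```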
If you’re going to be doing transcoding for remote users at lower bitrates, quicksync is still better than AMF, so I’d vote Team Intel.
If you’re not, then buy whatever meets your power envelope desires and price point.
For Intel, anything 8th gen or newer should be able to natively do anything you need in Quicksync, so you don’t need to head to Amazon and buy something new, unless you really want to.
Also, I’d consider hardware that has enough SATA ports for the number of drives you want so that you can avoid dealing with an HBA card: they inflate the power envelope of the system (if power usage is something you’re concerned with), and even in IT mode I’ve found them to be annoyingly goofy at times, so I’m MUCH happier just using integrated SATA.
I could be entirely thinking of some other Nap or something; I do know that even on battery it’s not impossible for a Mac to wake up and get stuck awake because of a misbehaving app, though I’m probably wrong about which specific feature was responsible.
They’re probably safe, since they don’t emulate commercially viable platforms via EmulatorJS, but never hurts.
Well, I was mostly reading it as ‘I put a new battery in, then tossed it back in the closet,’ and wanted to comment that adding the new battery isn’t a great idea, heh.
Anecdotally, the only leaky batteries I’ve had are ones that I replaced with new ones, which is why I’m on team no batteries if you’re not using it, and no batteries if it’s not absolutely required to make the system work.
I’m assuming it’s just a case of basically deprecated battery sizes being made by factories that aren’t putting in the time to do as good of QA as you might get from a more first-tier manufacturer, but whatever it is, even a shiny new battery is a risk to vintage stuff at this point.
Agree that you should probably replace the battery when you can, but you said this is only sometimes happening?
You might want to make sure that the laptop isn’t waking up due to power nap or that there’s not a process keeping it awake if it does.
I don’t have a battery-powered Mac sitting around right now, but Google should give you useful directions for both of those.
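The ‘is something keeping it awake’ part boils down to looking at power assertions; here’s a tiny sketch that just shells out to macOS’s stock pmset and dumps the list:

```python
#!/usr/bin/env python3
"""Sketch: list macOS power assertions (things preventing sleep) via pmset."""
import subprocess

def show_power_assertions() -> None:
    # `pmset -g assertions` lists processes holding sleep-preventing
    # assertions, which is where a 'stuck awake' laptop usually shows up.
    result = subprocess.run(
        ["pmset", "-g", "assertions"], capture_output=True, text=True, check=True
    )
    print(result.stdout)

if __name__ == "__main__":
    show_power_assertions()
```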
I’ve done it twice!
I’ve always debated between it needing to be on my resume as an ‘Achievement’ or not.