We are digital librarians. Among us are represented the various reasons to keep data -- legal requirements, competitive requirements, uncertainty of permanence of cloud services, distaste for transmitting your data externally (e.g. government or corporate espionage), cultural and familial archivists, internet collapse preppers, and people who do it themselves so they're sure it's done right. Everyone has their reasons for curating the data they have decided to keep (either forever or For A Damn Long Time). Along the way we have sought out like-minded individuals to exchange strategies, war stories, and cautionary tales of failures.
We are one. We are legion. And we're trying really hard not to forget.
SnapRAID doesn't compute parity in real time, so there's a window between making a change to the data and syncing during which your data isn't protected. The docs say:
Here's an example: you acquire a file and save it to disk as 'BestMovieEver.mkv'. This file sits on disk and is immediately available as usual, but until you run the parity sync the file is unprotected. This means that if you were to experience a drive failure between your download and a parity sync, that file would be unrecoverable.
This implies that the only data at risk is the data that's been changed, but that doesn't line up with my understanding of how parity works.
Say we have three disks that each store 1 bit of information, plus a parity drive: data 1, 0, 1 and parity 1 XOR 0 XOR 1 = 0. If we modify the data on the first disk (data 0, 0, 1, parity still 0), then the parity is out of sync. Say we now lose disk 2 (data 0, ?, 1, parity 0). How can it then recover disk 2 correctly?
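To make it concrete, here's a little Python sketch of single-parity XOR reconstruction (the scheme a single parity drive gives you), using the disk values from my example above:

```python
from functools import reduce
from operator import xor

def reconstruct(surviving, parity):
    # Single-parity recovery: the lost block is the XOR of the parity
    # block with all surviving data blocks.
    return reduce(xor, surviving, parity)

disks = [1, 0, 1]
parity = reduce(xor, disks)   # 1 XOR 0 XOR 1 = 0, parity in sync

disks[0] = 0                  # modify disk 1, but don't run a sync
# parity is now stale: the true XOR is 0 XOR 0 XOR 1 = 1, stored parity is 0

actual = disks[1]             # disk 2 -- which was never modified -- dies
rebuilt = reconstruct([disks[0], disks[2]], parity)
print(f"disk 2 really held {actual}, reconstruction returns {rebuilt}")
# -> disk 2 really held 0, reconstruction returns 1
```

So with stale parity, recovery silently returns the wrong bit for disk 2 even though disk 2 was never touched, which is exactly why "only the changed file is at risk" doesn't add up for me.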
I'm thinking of backing up all of my family's digital assets. The collection comes to less than 4TB. Most of it is redundant video files in old encodings (or not encoded at all), plus a lot of duplicate images and old documents. I'm gonna clean this stuff up with a bash script and some good old manual review (for the duplicate-finding part, see the sketch at the end of this post), but first I need to do some pre-planning.
What's the cheapest and most flexible NAS I can build from eBay or local sellers? What kind of processor and which motherboard features should I look for?
Which separate guides should I follow for sourcing the drives? What RAID setup?
What backup style should I follow? How many cold copies? How do I even handle the event of a fire?
I intend to do some of this research on my own, since no single answer will cover everything, but I'd appreciate any leads.
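For the duplicate cleanup specifically, here's the kind of pass I have in mind (Python rather than bash, and the root directory is whatever I point it at). It groups files by size first so only the collisions get hashed, then prints duplicate groups for manual review:

```python
import hashlib
import os
import sys
from collections import defaultdict

def sha256(path, chunk=1 << 20):
    h = hashlib.sha256()
    with open(path, "rb") as f:
        while block := f.read(chunk):
            h.update(block)
    return h.hexdigest()

# Pass 1: group by size -- a file with a unique size can't be a duplicate.
by_size = defaultdict(list)
for root, _, names in os.walk(sys.argv[1]):
    for name in names:
        path = os.path.join(root, name)
        by_size[os.path.getsize(path)].append(path)

# Pass 2: hash only the size collisions; print groups, don't delete.
for size, paths in by_size.items():
    if len(paths) < 2:
        continue
    by_hash = defaultdict(list)
    for path in paths:
        by_hash[sha256(path)].append(path)
    for dupes in by_hash.values():
        if len(dupes) > 1:
            print(f"{size} bytes:", *dupes, sep="\n  ")
```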
My partner's grandmother has passed away and left a collection of hundreds, possibly thousands, of DVDs. These range from official releases to pirated and bootleg copies.
What would be the best way to digitize and archive this collection? Is there an external device out there that will let me rip and convert the DVDs? I'd possibly want to upload them to archive.org where the copyright has expired, and store them on Backblaze or maybe another digital archiving site besides a regular torrent. I'd appreciate any recs on sites and advice in general. I haven't gone through these yet, but I figure the project would be a fun learning experience.
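For the conversion side, I'm imagining something like this HandBrakeCLI wrapper (the preset, the assumption that title 1 is the main feature, and the paths are all guesses on my part; commercial discs also need libdvdcss installed for CSS decryption):

```python
import subprocess
import sys
from pathlib import Path

# Convert each disc rip (a VIDEO_TS folder or .iso) in SRC to an MKV.
SRC = Path(sys.argv[1])   # folder full of ripped discs
OUT = Path(sys.argv[2])
OUT.mkdir(parents=True, exist_ok=True)

for disc in sorted(SRC.iterdir()):
    subprocess.run(
        ["HandBrakeCLI",
         "-i", str(disc),
         "-o", str(OUT / f"{disc.stem}.mkv"),
         "-t", "1",                     # assume title 1 is the main feature
         "--preset", "Fast 1080p30"],   # a stock HandBrake preset
        check=True,                     # stop if a disc fails to convert
    )
```

Bootlegs often have odd title layouts, so the output would need spot-checking.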
Hi there, I've been meaning to get more serious about my data. I have minimal backups, and some stuff is not backed up at all. I'm begging for disaster.
Here's what I've got:
2x 8TB drives, almost full, in universal external enclosures
A small form factor PC as a server, with one 8TB drive connected
An unused Raspberry Pi
No knowledge of how to properly use ZFS
Here's what I want:
I've decided I don't need RAID. I don't want the extra cost in drives or electricity, and I don't need uptime. I just need backups.
I want to use the drives I have, plus an additional 16TB drive I'll buy.
My thought was to replace the 8TB drive with a 16TB one and format it with ZFS (primarily to avoid bit rot; I'll need to learn how to check for this), then back it up across the two 8TB drives as a cold backup. Either as two separate drives somehow? Btrfs volume extension? Or as a JBOD connected to the Raspberry Pi that I leave unplugged except when it's time to sync new data?
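For the "two separate drives somehow" option, one approach that needs no Btrfs or JBOD at all is to greedily split the pool's top-level folders across the two 8TB drives by size. A sketch, with hypothetical mount points:

```python
import os

SOURCE = "/tank"                                # the new 16TB ZFS pool
TARGETS = ["/mnt/backup-a", "/mnt/backup-b"]    # the two 8TB cold drives

def tree_size(path):
    # Total bytes under path, best effort.
    total = 0
    for root, _, names in os.walk(path):
        for name in names:
            try:
                total += os.path.getsize(os.path.join(root, name))
            except OSError:
                pass
    return total

sizes = {e.path: tree_size(e.path) for e in os.scandir(SOURCE) if e.is_dir()}
used = {t: 0 for t in TARGETS}
for folder, size in sorted(sizes.items(), key=lambda kv: kv[1], reverse=True):
    target = min(TARGETS, key=used.get)         # drive with less data so far
    used[target] += size
    print(f"rsync -a --delete '{folder}/' '{target}/{os.path.basename(folder)}/'")
```

As for the bit-rot check: on ZFS that's a periodic `zpool scrub <pool>`, with `zpool status` to read the result afterwards.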
Archive Team has just begun the distributed archiving of the Japanese SS Blog, a blog hosting service, which is set to be discontinued on March 31, 2025.
And you can help! There isn't much time left, so we need as many people as possible running the Warrior.
The tracker (at the top of the page) has the simplest info on how you can help out.
The GitHub page offers a Docker-based alternative for advanced users, plus more info on best practices for this sort of archiving.
Why help out?
The web is disappearing all the time, and often a lot of previously easily accessible information is lost to time. These Japanese blogs may not be very important to you, but they ce
This is the first time I am uploading patched ROMs; previously I uploaded only the patch files. This is my personal collection of Super Nintendo ROM hacks as ready-to-play patched ROMs in .sfc and .smc formats, complete with a descriptive text document. Most, if not all, of the files were patched by myself, but I have not tested every game yet. Some old ROM hacks do not work in accurate emulators.
Please share this anywhere ROM files are allowed to be shared. I am only sharing it here at the moment.
This collection comes in two variants: flat structure and sub structure.
"flat" just means all ROMs and documents are saved in one single directory.
"sub" means every game gets its own dedicated directory, where only the related ROM hacks and mods are saved.
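For anyone who previously grabbed only my patch files, a minimal IPS applier in Python looks like this (IPS covers most old SNES hacks; BPS patches need a different tool, and the paths are placeholders):

```python
import sys

def apply_ips(rom_path, patch_path, out_path):
    # IPS layout: "PATCH", then records of 3-byte big-endian offset and
    # 2-byte size (size 0 = RLE record: 2-byte run length, 1 repeated
    # byte), terminated by "EOF". Records may write past the ROM's end.
    rom = bytearray(open(rom_path, "rb").read())
    patch = open(patch_path, "rb").read()
    if patch[:5] != b"PATCH":
        raise ValueError("not an IPS patch")
    i = 5
    while patch[i:i + 3] != b"EOF":
        offset = int.from_bytes(patch[i:i + 3], "big")
        size = int.from_bytes(patch[i + 3:i + 5], "big")
        i += 5
        if size == 0:                              # RLE record
            run = int.from_bytes(patch[i:i + 2], "big")
            data = patch[i + 2:i + 3] * run
            i += 3
        else:
            data = patch[i:i + size]
            i += size
        if offset + len(data) > len(rom):          # hack grows the ROM
            rom.extend(b"\x00" * (offset + len(data) - len(rom)))
        rom[offset:offset + len(data)] = data
    open(out_path, "wb").write(bytes(rom))

if __name__ == "__main__":
    apply_ips(sys.argv[1], sys.argv[2], sys.argv[3])
```

Note that some hacks expect the ROM with or without its 512-byte copier header, so patch whichever variant the hack's readme asks for.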
For years I've looked on and off for web archiving software that can capture most sites, including "complex" ones with lots of AJAX that require logins, like Reddit. Which ones have worked best for you?
Ideally I want one that can be started programmatically or via the command line, opens a Chromium instance (or any browser), and captures everything shown on the page. I could also open the instance myself to log into sites and install addons like uBlock Origin. (By the way, archiveweb.page must be started manually.)
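The closest I've gotten to that workflow is driving a real browser with Playwright and recording all the traffic to a HAR (not a proper WARC, and the profile directory and URL below are placeholders):

```python
from playwright.sync_api import sync_playwright

with sync_playwright() as p:
    # A persistent profile keeps logins between runs: start it headed
    # once, sign in manually, and later runs reuse the cookies.
    ctx = p.chromium.launch_persistent_context(
        user_data_dir="./browser-profile",
        headless=False,
        record_har_path="capture.har",    # every request/response
        record_har_content="embed",       # keep bodies inside the HAR
    )
    page = ctx.new_page()
    page.goto("https://old.reddit.com/", wait_until="networkidle")
    page.mouse.wheel(0, 20000)            # nudge lazy-loaded content
    page.wait_for_timeout(5000)
    ctx.close()                           # the HAR is written on close
```

Tools like browsertrix-crawler work on the same drive-a-real-browser principle but emit WARCs directly.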
I've been thinking about picking up an N150 or 5825U MiniITX board for a NAS, but I'm wondering if there are better options given my requirements.
At least 2x 2.5GbE LAN ports
A 10GbE port, or 2.5GbE if not available
2x NVMe
8x SATA for spinning disks
2x SATA for SSDs
Mini-ITX (required for the 10" rack)
64+ GB of RAM for ZFS cache (not possible on an N150)
The problem I'm running into with the boards I've looked at is PCIe lanes: there's no way to expand the SATA or network ports without stealing lanes from the NVMe slots.
I've started looking at boards with a PCIe 4.0 x16 slot plus risers/splitters for expansion, but then I can't find low-power CPUs for them.
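Here's the back-of-the-envelope lane math that keeps sinking these boards (the per-CPU lane counts are my assumptions from memory, so verify them against the datasheets before trusting this):

```python
# Usable CPU PCIe lanes -- assumed values, check the actual datasheets.
CPU_LANES = {"N150": 9, "5825U": 20}

# What the build wants, as (device, lanes each, count).
needs = [
    ("NVMe SSD (x4)",         4, 2),
    ("8-port SATA HBA (x8)",  8, 1),   # e.g. an LSI 9211-8i-class card
    ("10GbE NIC (x4)",        4, 1),
    ("2.5GbE NIC (x1)",       1, 2),
]

required = sum(lanes * count for _, lanes, count in needs)
for cpu, budget in CPU_LANES.items():
    verdict = "fits" if required <= budget else f"short by {required - budget}"
    print(f"{cpu}: need {required} lanes, have ~{budget} -> {verdict}")
```

Even with generous assumptions something has to give, which is why every board I find ends up stealing lanes from the NVMe slots.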
Checklist:
I'm reporting that yt-dlp is broken on a supported site
I've verified that I have updated yt-dlp to nightly or master (update instructions)
I've checked that all provided URLs are playabl...
A team of volunteer archivists has recreated the Centers for Disease Control website exactly as it was the day Donald Trump was inaugurated. The site, called RestoredCDC.org, went live Tuesday and is currently being hosted in Europe.
As we have been following since the beginning of Trump's second term, websites across the entire federal government have been altered and taken offline under this administration's war on science, health, and diversity, equity, and inclusion. Critical information promoting vaccines, HIV care, reproductive health options including abortion, and trans and gender confirmation healthcare has been purged from the CDC's live website under Trump. Disease surveillance data about bird flu and other concerns have either been delayed or stopped being updated entirely. Some deleted pages across the government have at least temporarily been restored thanks to a court order, but the Trump administratio
I have been lurking in this community for a while now and have really enjoyed the informational and instructional posts, but one topic I don't see come up very often is scaling while hoarding. Currently I have a 20TB server which I am rapidly filling, and most posts about expanding recommend simply buying larger drives and slotting them into a single machine. That is definitely the easiest way to expand, but it seems like it would only get you to about 100TB before you can't reasonably do that anymore. So how do you set up 100TB+ networks with multiple servers?
My main concern is that currently all my services are dockerized on a single machine running Ubuntu, which works extremely well. It is space efficient with hardlinking and I can still seed back everything. From different posts I've read, it seems like as people scale they either give up on hardlinks and then eat up a lot of their storage with copied files, or they eve
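As I understand it, the reason scaling kills hardlinks is simply that a hardlink cannot cross filesystems, let alone machines. A quick check makes it visible (the paths are hypothetical):

```python
import os

src = "/pool1/torrents/movie.mkv"   # hypothetical seeding copy
dst = "/pool2/library/movie.mkv"    # hypothetical library location

# st_dev identifies the filesystem; os.link() raises EXDEV across two of them.
if os.stat(os.path.dirname(src)).st_dev == os.stat(os.path.dirname(dst)).st_dev:
    os.link(src, dst)               # same filesystem: a free "copy"
    print("hardlinked, no extra space used")
else:
    print("different filesystems: hardlinking impossible, a real copy is needed")
```

So once the library spans multiple pools or servers, every seeded file either stays where it is or gets duplicated.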
Just noticed this today - seems all the archiving activity has been noticed by NCBI / NLM staff. Thankfully most of SRA (the Sequence Read Archive) and other genomic data is also mirrored in Europe.
If you've got some bandwidth to share and want to help archive the web, set up an ArchiveTeam Warrior Docker container. It will download stuff and upload it to the Internet Archive. They (and I) recommend using "ArchiveTeam's Choice", but you can set it to work on specific projects like "US Govern...
I set up an instance of the ArchiveTeam Warrior on my home server with Docker in under 10 minutes. Feels like I'm doing my part to combat removal of information from the internet.
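If you'd rather script the setup than click through, something like this docker-py snippet does it — though the image name and web-UI port are from memory, so double-check them against the ArchiveTeam wiki:

```python
import docker

# Start an ArchiveTeam Warrior container that restarts itself on failure.
client = docker.from_env()
client.containers.run(
    "atdr.meo.ws/archiveteam/warrior-dockerfile",   # image per AT's docs (verify)
    name="archiveteam-warrior",
    detach=True,
    ports={"8001/tcp": 8001},                       # web UI on localhost:8001
    restart_policy={"Name": "on-failure", "MaximumRetryCount": 5},
)
print("Warrior running; pick a project at http://localhost:8001")
```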
In light of some of the recent dystopian executive orders, a lot of data is being proactively taken down. I am relying on this data for a report I'm writing at work, and I suspect a lot of others may be relying on it for more important reasons. As such, I created two torrents, one for the data behind the ETC Explorer tool and another for the data behind the Climate and Economic Justice Screening Tool. Here's an article about taking down the latter. My team at work suspects the former will follow soon.
Here are the .torrent files. Please help seed. They're not very large at all, <300 MB.
Draft Digital Personal Data Protection Rules, 2025 may lead to deletion of deceased individuals' social media accounts.
This is bad, like very bad. The proposed draft law in India, in its current form, only prescribes deletion and purging of inactive accounts when the users die. There should be a clause describing archiving or locking/suspension (like Facebook's memorialization feature) as alternative methods to account deletion.
If the law is pushed through as-is and passed by the legislature, the understanding of the past will be destroyed in the long term, just like the fires in LA have already done to the archives of the notable composer Arnold Schoenberg.
If you're an Indian citizen you can go to this page to post your feedback and concerns.