@bastianallgeier did you download the extended backup too? Lots of interesting information in there, see waterpigs.co.uk/articles/data-export#facebook-extended
Uh-oh, markdoesntunderstandanimals.com has gone offline. Time to crank up wget, pointed at markdua.tumblr.com #sitedeath
If you want to maintain a personal copy too, here’s the command:
wget -r http://markdua.tumblr.com -D 25.media.tumblr.com,markdua.tumblr.com -H -w 1
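For the record, the flags do the following: -r recurses through links, -H lets wget span to other hosts, -D restricts that spanning to the listed domains (so the 25.media.tumblr.com image host gets mirrored along with the blog itself), and -w 1 waits a second between requests so the server isn't hammered.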
I was going to spend this evening working on #webactions browser extension, but I think it would be better spent providing a #backup/data export utility for fellow ex #diaspora users.
From my initial research, it looks like /u/username.json is the best bet, as it gives a JSON array of all posts written by username, along with like and comment data. It accepts a max_time=timestamp query param, and a _ query param, the function of which I am not sure of.
To iterate through all the pages of posts from a certain user, start with their profile URL w/ .json tacked on the end, fetch all the items, get the datetime of the last item, convert that to a timestamp, fetch the same URL with ?max_time=timestamp, and repeat until an empty array is returned.
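For anyone who wants to script that loop, here is a minimal Python sketch. The pod URL, the username, the created_at field name and the assumption that max_time takes a Unix timestamp are all guesses on my part rather than documented behaviour, so adjust them to match what your pod actually returns.

#!/usr/bin/env python3
"""Sketch of the pagination loop described above.

Assumptions (not confirmed): each post object carries a "created_at"
ISO 8601 datetime, and max_time takes a Unix timestamp.
"""
import json
import time
from datetime import datetime
from urllib.request import urlopen

POD = "https://example-pod.org"  # hypothetical pod URL
USERNAME = "username"            # the account to export


def fetch_page(max_time=None):
    """Fetch one page of the user's posts as a list of dicts."""
    url = f"{POD}/u/{USERNAME}.json"
    if max_time is not None:
        url += f"?max_time={max_time}"
    with urlopen(url) as response:
        return json.loads(response.read().decode("utf-8"))


def export_all_posts():
    """Walk backwards through the post history until an empty page comes back."""
    posts, max_time = [], None
    while True:
        page = fetch_page(max_time)
        if not page:
            break
        posts.extend(page)
        # Turn the last item's datetime into a Unix timestamp for the next request.
        last = datetime.fromisoformat(page[-1]["created_at"].replace("Z", "+00:00"))
        max_time = int(last.timestamp())
        time.sleep(1)  # be gentle with the pod
    return posts


if __name__ == "__main__":
    all_posts = export_all_posts()
    with open(f"{USERNAME}-diaspora-export.json", "w") as f:
        json.dump(all_posts, f, indent=2)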
Salvaged three laptop hard drives — almost 1TB of storage for the price of some enclosures.