ljdump Users Group's Journal
 

Below are the 20 most recent journal entries recorded in ljdump Users Group's LiveJournal:

Thursday, February 2nd, 2017
12:52 pm
[tcb]
Comments broken again?
Hullo!

I made the mods calmingshoggoth suggested after ljdump barfed after entry 1000. It then processed all the entries, but was unable to grab the comments:

Fetching journal entry L-1494 (update)
Fetching journal comments for: tcb
*** Error fetching comment body, possibly not community maintainer?
*** not well-formed (invalid token): line 111, column 15
Fetching userpics for: tcb
1434 new entries, 0 new comments
Any ideas?
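A diagnostic sketch (not part of ljdump; the names here are illustrative): wrap the parse of the comment-body response and save the raw XML to disk, so that line 111, column 15 of whatever the server returned can be inspected by hand.

import xml.dom.minidom
from xml.parsers.expat import ExpatError

# "raw" is assumed to be the string ljdump got back from the
# export_comments request for this journal.
def parse_or_save(raw, dumpfile="bad_comment_body.xml"):
    try:
        return xml.dom.minidom.parseString(raw)
    except ExpatError, err:
        open(dumpfile, "w").write(raw)
        print "Parse failed (%s); raw response saved to %s" % (err, dumpfile)
        raise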
Sunday, January 29th, 2017
11:53 am
[aoeui21]
How to dump another user's journal? Receiving "Don't have access to requested journal" error
Hi.

I'm trying to dump another user's journal:
I specify user-name1 and password1 in "ljdump.config",
but set journal=otheruser2.

when i run the ljdump.py, the error appears:

Fetching journal entries for: otheruser2
Traceback (most recent call last):
  File "./ljdump.py", line 368, in 
    ljdump(server, username, password, e.childNodes[0].data)
  File "./ljdump.py", line 175, in ljdump
    }, Password))
  File "/usr/lib/python2.7/xmlrpclib.py", line 1243, in __call__
    return self.__send(self.__name, args)
  File "/usr/lib/python2.7/xmlrpclib.py", line 1602, in __request
    verbose=self.__verbose
  File "/usr/lib/python2.7/xmlrpclib.py", line 1283, in request
    return self.single_request(host, handler, request_body, verbose)
  File "/usr/lib/python2.7/xmlrpclib.py", line 1316, in single_request
    return self.parse_response(response)
  File "/usr/lib/python2.7/xmlrpclib.py", line 1493, in parse_response
    return u.close()
  File "/usr/lib/python2.7/xmlrpclib.py", line 800, in close
    raise Fault(**self._stack[0])
xmlrpclib.Fault: <Fault 300: "Client error:  Don't have access to requested journal: You don't have access">


Is it possible to dump the (I suppose only public) entries of another user's LJ?
I'm using a freshly downloaded ljdump.py version 1.5.1 on Ubuntu 16.10.
Wednesday, January 4th, 2017
7:01 am
[calmingshoggoth]
I had some problems with ljdump version 1.5.1 and fixed them
The first problem was that LJ limits you to 1000 fetches per hour. I made the loop sleep for four seconds (60*60/1000 is 3.6, so I rounded up) between fetches and it doesn't seem to have that problem anymore.

Then I ran into a problem with comments. The first comment id in the journal I am backing up is in the mid four thousands. When it fetched with an id of 1, it got back an empty set of comments (<comments></comments>) and then looped endlessly because it never changed the maxid. I changed the value in the .last file to 4000 and it fetched the first thousand or so comments, but then there was another big gap in the comment ids (it jumped up to seven thousand something) and it got stuck in an infinite loop again.

Looking at the code I noticed that all of the ids are present in the comment.meta file, so I changed the code to grovel through that data structure instead of just blindly using maxid + 1 as the next id.

Here is the diff containing my changes:

( Read more... )
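A rough sketch of the two changes described above (illustrative names only, not the actual patch behind the cut):

import time

FETCH_DELAY = 4  # seconds; 3600 / 1000 = 3.6, rounded up

def next_comment_id(known_ids, current_max):
    # Smallest id recorded in comment.meta that is greater than current_max,
    # or None once everything has been fetched.
    remaining = [cid for cid in sorted(known_ids) if cid > current_max]
    return remaining[0] if remaining else None

# Inside the comment-body loop (metacache holds the ids parsed from
# comment.meta; fetch_bodies stands in for the existing request code):
#
#     startid = next_comment_id(metacache.keys(), newmaxid)
#     if startid is None:
#         break
#     time.sleep(FETCH_DELAY)   # stay under LJ's ~1000 fetches/hour limit
#     newmaxid = fetch_bodies(startid)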
Monday, May 30th, 2016
12:53 pm
[fpoling2]
ljdump port to Go
ljdumpgo - port of ljdump.py to Go, https://github.com/ibukanov/ljdumpgo.

I figured out that my Python skills are not good enough to port the code to Python 3, so I ported it to Go instead.
Wednesday, May 14th, 2014
10:32 am
[eugene_ivanov]
Fault 402: Client error: Your IP address is temporarily banned for exceeding the login failure rate
How do I cure it?

At around 1000 records this error occurs.

Is it possible to connect through a proxy?
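For the plain-HTTP parts, urllib2 can be pointed at a proxy; a minimal sketch follows (an assumption on my part, ljdump has no built-in proxy option that I know of, and the address below is only an example). The XML-RPC calls made through xmlrpclib would additionally need a custom Transport.

import urllib2

# Route all urllib2.urlopen() calls through an HTTP proxy.
proxy = urllib2.ProxyHandler({'http': 'http://127.0.0.1:3128'})
urllib2.install_opener(urllib2.build_opener(proxy))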
Tuesday, May 13th, 2014
8:26 pm
[eugene_ivanov]
Bug. String and number typing
I found a bug.

In my post there is text with a "+", but in the XML created by your program the "+" is lost from the event field.

For example, the text of the post is "+887878",

but in the XML it will be
887878

without the "+"!

Why?

I think the bug is in xmlrpclib, because at this point in the function "dumpelement":

s = unicode(str(e[k]), "UTF-8")

e[k] is a NUMBER, not a string! A number without the "+", of course.

How can I fix this?

Thanks.

P.S.

As a temporary solution, I made and tested a workaround that downloads the entry through the web editor. It is also used when there is an lj-embed in the entry; it grabs the entire embed instead of just a useless link.

print "Fetching journal entry %s (%s)" % (item['item'], item['action'])
try:
    e = server.LJ.XMLRPC.getevents(dochallenge(server, {
        'username': Username,
        'ver': 1,
        'selecttype': "one",
        'itemid': item['item'][2:],
        'usejournal': Journal,
    }, Password))
    if e['events']:

        #--------------added by EI 20140503
        i = e['events'][0]['ditemid']
        tt = e['events'][0]['event']

        tt = unicode(str(tt), "UTF-8")

        ro = re.compile('lj-embed', re.M | re.S | re.U)
        n_ro = re.compile('^\d+$', re.M | re.S | re.U)

        m = re.search(ro, tt)
        is_number = re.search(n_ro, tt)

        if m or is_number:
            rr = int(item['item'][2:])

            r = urllib2.urlopen(urllib2.Request(Server+"/editjournal.bml?journal=%s&itemid=%d%s" % (Journal, i, authas), headers = {'Cookie': "ljsession="+ljsession}))
            meta = r.read()
            r.close()

            ro = re.compile('<textarea[^>]+id="body"[^>]+>(.*?)</textarea>', re.M | re.S | re.U)
            m = re.search(ro, meta)
            if m:
                e['events'][0]['event'] = str(m.group(1))
                e['events'][0]['event'] = saxutils.unescape(e['events'][0]['event'], {'"':'"'})
        #-----------------
        writedump("%s/%s" % (Journal, item['item']), e['events'][0])
Monday, December 16th, 2013
5:44 pm
[gyve]
First attempt at Python 3 port
Here it goes: http://pastebin.com/Q35EmMyY

It doesn't use this part because of some sort of bytes/str serialization error that I don't know anything about:

f = codecs.open("%s/comment.meta" % Journal, "w", "UTF-8")
#pickle.dump(metacache, f)
f.close()

f = codecs.open("%s/user.map" % Journal, "w", "UTF-8")
#pickle.dump(usermap, f)
f.close()
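A likely cause of that serialization error (a guess, not verified against the pasted port): in Python 3, pickle.dump() writes bytes, while codecs.open(..., "w", "UTF-8") returns a text stream, so the dump fails with a TypeError. Opening the files in binary mode should let those two blocks be re-enabled:

import pickle

# Journal, metacache and usermap come from the surrounding ljdump code.
with open("%s/comment.meta" % Journal, "wb") as f:
    pickle.dump(metacache, f)

with open("%s/user.map" % Journal, "wb") as f:
    pickle.dump(usermap, f)

The corresponding pickle.load() calls would need the files opened with "rb" as well.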


Current Mood: exhausted
Thursday, March 28th, 2013
7:09 pm
[half_of_monty]
How can I view my journal locally?
Thanks for building this great tool. I've run it fine and seem to have downloaded everything. Now, how can I nicely view them?

I've flicked back through this forum and found a previous discussion of how non-l33t people like me might get a bit stuck, but there doesn't seem to be a nice answer there. It looks like I could import the whole thing into wordpress if I liked, but I don't want to; I've moved on from this journal, and I just want to keep it for old times' sake.

So what's the best way to view or convert it? Thanks.
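ljdump does not ship a viewer as far as I can tell, but the per-entry XML files are simple enough to stitch into a single HTML page. A very rough sketch, assuming the dump directory is named after the journal, entries are the files named L-<number>, and each file has eventtime, subject and event elements (check your own files; the names may differ):

import codecs
import glob
import xml.dom.minidom

def text_of(doc, tag):
    # First text node of the first <tag> element, or "" if absent.
    nodes = doc.getElementsByTagName(tag)
    if nodes and nodes[0].firstChild:
        return nodes[0].firstChild.data
    return ""

out = codecs.open("journal.html", "w", "utf-8")
out.write(u"<html><head><meta charset='utf-8'></head><body>\n")
for path in sorted(glob.glob("yourjournal/L-*"),
                   key=lambda p: int(p.rsplit("-", 1)[1])):
    doc = xml.dom.minidom.parse(path)
    out.write(u"<h2>%s: %s</h2>\n" % (text_of(doc, "eventtime"),
                                      text_of(doc, "subject")))
    out.write(u"<div>%s</div>\n<hr>\n" % text_of(doc, "event"))
out.write(u"</body></html>\n")
out.close()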
Thursday, November 3rd, 2011
9:53 pm
[ron_newman]
ljdump can retrieve comments once again
At least it did for me, just now. If you've had trouble recently, try it again today.
Wednesday, November 2nd, 2011
8:58 am
[matveyp]
ljdump 1.5.1 fails with HTTP error 405
ljdump v1.5.1 fails. This is what I get:

$ ./ljdump.py
Fetching journal entries for: *********
Traceback (most recent call last):
  File "./ljdump.py", line 370, in <module>
    ljdump(server, username, password, username)
  File "./ljdump.py", line 132, in ljdump
    ljsession = getljsession(Server, Username, Password)
  File "./ljdump.py", line 60, in getljsession
    r = urllib2.urlopen(server+"/interface/flat", "mode=getchallenge")
  File "/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/urllib2.py", line 126, in urlopen
    return _opener.open(url, data, timeout)
  File "/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/urllib2.py", line 400, in open
    response = meth(req, response)
  File "/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/urllib2.py", line 513, in http_response
    'http', request, response, code, msg, hdrs)
  File "/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/urllib2.py", line 438, in error
    return self._call_chain(*args)
  File "/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/urllib2.py", line 372, in _call_chain
    result = func(*args)
  File "/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/urllib2.py", line 521, in http_error_default
    raise HTTPError(req.get_full_url(), code, msg, hdrs, fp)
urllib2.HTTPError: HTTP Error 405: Method Not Allowed

Anyone else having this problem?
Friday, October 28th, 2011
11:47 am
[the_xtina]
Thursday, October 27th, 2011
12:44 am
[ron_newman]
What does "405 Method Not Allowed" error mean?
I'm getting this error when I run ljdump. This has never happened to me before. What do I need to do to fix it?

( Python traceback behind cut )
Monday, August 22nd, 2011
4:20 pm
[suraimu]
Trying to use LJdump on Windows - I get this error:
C:\Utils\ljdump>ljdump.py
  File "C:\Utils\ljdump\ljdump.py", line 125
    print "Fetching journal entries for: %s" % Journal
                                           ^
SyntaxError: invalid syntax

C:\Utils\ljdump>
I've tried with ljdump.config being set up, I've tried renaming it so that maybe it would force-ask me each time - I even tried running it from an admin-empowered command prompt. Same error each time. Anyone know what the deal is? Thanks.
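A likely explanation (an assumption; the post doesn't say which Python is installed): ljdump 1.5.1 is Python 2 code, and under Python 3 the bare print statement produces exactly this SyntaxError, so the script needs a Python 2 interpreter (for example "py -2 ljdump.py" if the Windows launcher has one registered). A tiny check that runs under either version:

import sys

# ljdump 1.5.1 uses Python 2 "print" statements and xmlrpclib.
if sys.version_info[0] != 2:
    sys.exit("ljdump 1.5.1 needs Python 2; this is %d.%d" % sys.version_info[:2])
print("Python %d.%d - OK" % sys.version_info[:2])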
Saturday, August 20th, 2011
9:49 am
[khyron]
getting an error from LJ (ljprotocol.py), how to address?

I've been unsuccessfully trying to extract my LJ content into a wordpress.com account using their import tool. The wordpress.com web form accepts my username and password and appears to successfully authenticate with LJ, but then immediately spits out an XML-RPC error. The exact same error is reported by ljdump (most recent version, 1.5.1) when I run it on the command line on my Snow Leopard machine:

couldn't retrieve anum for entry at /home/lj/cgi-bin/ljprotocol.pl line 3952

If I understand this correctly, this appears to be an issue with LJ itself, correct? If so, what's the proper procedure for reporting this in the most helpful possible way?

I've been able to completely back up my journal using XJournal, but I don't know if the XML it produces (the .plist file) is actually of use to a wordpress.com importer, plus it does not appear to grab comments. I definitely want to preserve my comments.

Thanks in advance for any help!

Friday, August 5th, 2011
10:35 am
[red_valjok]
Dumping somebody else's journal
Why is this not provided?
Monday, August 1st, 2011
4:09 pm
[lease]
Connection Error?
I'm on a Mac (10.6) and getting the following error:


Fetching journal entries for: lease
Traceback (most recent call last):
  File "./ljdump.py", line 390, in <module>
    ljdump(server, username, password, username)
  File "./ljdump.py", line 132, in ljdump
    ljsession = getljsession(Server, Username, Password)
  File "./ljdump.py", line 60, in getljsession
    r = urllib2.urlopen(server+"/interface/flat", "mode=getchallenge")
  File "/System/Library/Frameworks/Python.framework/Versions/2.6/lib/python2.6/urllib2.py", line 124, in urlopen
  File "/System/Library/Frameworks/Python.framework/Versions/2.6/lib/python2.6/urllib2.py", line 383, in open
  File "/System/Library/Frameworks/Python.framework/Versions/2.6/lib/python2.6/urllib2.py", line 401, in _open
  File "/System/Library/Frameworks/Python.framework/Versions/2.6/lib/python2.6/urllib2.py", line 361, in _call_chain
  File "/System/Library/Frameworks/Python.framework/Versions/2.6/lib/python2.6/urllib2.py", line 1130, in http_open
  File "/System/Library/Frameworks/Python.framework/Versions/2.6/lib/python2.6/urllib2.py", line 1105, in do_open
urllib2.URLError:
Friday, May 27th, 2011
11:13 am
[aryanhwy]
problem getting userpics
Well, it's not really a problem because I haven't added any new user pics in a few years, but I just ran ljdump (1.3.1), and when it reached the point of getting new user pics, I got this error:

Fetching userpics for: aryanhwy
Traceback (most recent call last):
  File "./ljdump.py", line 285, in <module>
    pic = urllib2.urlopen(userpics[p])
  File "/usr/lib64/python2.7/urllib2.py", line 126, in urlopen
    return _opener.open(url, data, timeout)
  File "/usr/lib64/python2.7/urllib2.py", line 391, in open
    response = self._open(req, data)
  File "/usr/lib64/python2.7/urllib2.py", line 409, in _open
    '_open', req)
  File "/usr/lib64/python2.7/urllib2.py", line 369, in _call_chain
    result = func(*args)
  File "/usr/lib64/python2.7/urllib2.py", line 1173, in http_open
    return self.do_open(httplib.HTTPConnection, req)
  File "/usr/lib64/python2.7/urllib2.py", line 1148, in do_open
    raise URLError(err)
urllib2.URLError:


Any thoughts?
Wednesday, April 6th, 2011
6:54 pm
[ron_newman]
ljdump just hangs. how do I diagnose and fix?
I'm trying to use ljdump.py, as I have many times before without problems, but now it is hanging right after it says

Fetching journal entries for: davis_square

If I interrupt with control-C, it is stuck at line 168:

r = server.LJ.XMLRPC.syncitems(dochallenge(server, {
    'username': Username,
    'ver': 1,
    'lastsync': lastsync,
    'usejournal': Journal,
}, Password))


How can I fix this?
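One way to turn the hang into a visible error (a sketch, not something ljdump does itself): give all new sockets a default timeout before the ServerProxy is used, so a stalled syncitems call raises socket.timeout instead of blocking forever.

import socket
import xmlrpclib

# xmlrpclib in Python 2 has no timeout option of its own, but it opens
# ordinary httplib sockets, so a process-wide default timeout applies.
socket.setdefaulttimeout(60)  # seconds; pick whatever feels reasonable

server = xmlrpclib.ServerProxy("http://www.livejournal.com/interface/xmlrpc")
# Subsequent server.LJ.XMLRPC.syncitems(...) calls now raise socket.timeout
# instead of hanging indefinitely.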
4:51 pm
[ledflyd]
error...
The code backs up a couple of entries (between 3 and 10) at a time and then outputs this error:


File "ljdump.py", line 390, in <module>
ljdump(server, username, password, username)
File "ljdump.py", line 189, in ljdump
}, Password))
File "/usr/lib/python2.6/xmlrpclib.py", line 1199, in __call__
return self.__send(self.__name, args)
File "/usr/lib/python2.6/xmlrpclib.py", line 1489, in __request
verbose=self.__verbose
File "/usr/lib/python2.6/xmlrpclib.py", line 1243, in request
headers


I start the program again and it adds a few more and crashes again. I'm running Ubuntu 10.04, if that's relevant.

Separate question: are comments not supported anymore? Is it just the reply count but not the actual replies?

Thanks for this otherwise useful resource!
Friday, March 11th, 2011
9:14 pm
[dar205]
So now what do I do?
I am sort of confused about what to do with the files after I download them. I wanted to archive my old, retired journal (from 2002-6). I have the files on my computer. Is there some sort of reader I can use? I get that the files are in XML; will a standard XML viewer work?

The files are on my Ubuntu server, which I access from my Windows 7 machine. I would prefer a Windows solution, but I can SSH into the server if there is a command line solution. I would like to be able to look at and review the material.