Compare commits

14 Commits

Author SHA1 Message Date
rocky
39ce40004b Get ready for release 2.3.0 2016-04-30 11:22:58 -04:00
rocky
0a32a16d88 Python 3.0..3.2 bug in LOAD_FAST/STORE_LOCALS
LOAD_FAST         '__locals__'
STORE_LOCALS      ''

Also have to adjust doc constants for this crap

astnode.py: minor format change
2016-04-30 09:12:03 -04:00
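
For reference, a minimal sketch of how to observe the prologue this commit handles, assuming a CPython 3.0-3.2 interpreter (the only versions with the STORE_LOCALS opcode):

import dis

# Compile a trivial class and fish out the class-body code object.
code = compile("class C(object):\n    x = 1\n", "<example>", "exec")
class_body = next(c for c in code.co_consts if hasattr(c, "co_code"))

# On 3.0-3.2 the disassembly begins with:
#   LOAD_FAST    0 (__locals__)
#   STORE_LOCALS
dis.dis(class_body)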
rocky
4aa703d727 Test optimized Python code and Python 3.2 2016-04-30 06:54:01 -04:00
rocky
f3a4e6ee54 Previous commit's grammar change is for Python 3.5 and up 2016-04-30 04:03:38 -04:00
rocky
43f5c5dcca Decompile Python 3.5 if statements
Sometimes it doesn't need JUMP_FORWARD _come_from _come_from

For example:

def handle2(module):
    if module == 'foo':
        try:
            module = 1
        except ImportError as exc:
            module = exc

    return module

And:

if __name__:
    for i in (1, 2):
        x = 3
2016-04-30 03:51:54 -04:00
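
A quick way to see this, as a sketch (assuming a CPython 3.5 interpreter): disassemble handle2 above and note that the "if" body simply falls through to the return, with no JUMP_FORWARD for the grammar to match.

import dis

def handle2(module):
    if module == 'foo':
        try:
            module = 1
        except ImportError as exc:
            module = exc
    return module

# Under 3.5 no JUMP_FORWARD separates the "if" body from the return,
# which is why the parser needs an _ifstmts_jump rule without one.
dis.dis(handle2)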
rocky
3e49aa56bb spark -> spark_parser 2016-04-28 19:03:51 -04:00
rocky
9cc9fc99c2 Really remove spark - Use external package instead 2016-04-28 02:12:30 -04:00
R. Bernstein
2ebc558b40 Merge pull request #8 from rocky/external-spark
External spark
2016-04-27 23:30:36 -04:00
rocky
34a582b64c Administrivia 2016-04-27 23:26:31 -04:00
rocky
2711c8d06f Note dependencies on spark 2016-04-27 23:09:30 -04:00
rocky
40badefe9d Use external spark now. 2016-04-27 23:04:31 -04:00
rocky
d9ef5ff69a Back to 2.7.8 2016-04-20 05:31:38 -04:00
rocky
a4e839960f Try python 2.7.10 2016-04-20 05:16:25 -04:00
rocky
7b3c7e83ec Remove link to Mysterie uncompyle2 per request 2016-04-19 04:05:05 -04:00
28 changed files with 155 additions and 768 deletions

.gitignore

@@ -3,6 +3,8 @@
/.cache
/.eggs
/.python-version
/.tox
/README
/__pkginfo__.pyc
/dist
/how-to-make-a-release.txt

.travis.yml

@@ -9,6 +9,7 @@ python:
- '3.5'
install:
- pip install -r requirements.txt
- pip install -r requirements-dev.txt
script:

ChangeLog

@@ -1,6 +1,69 @@
2016-04-30 rocky <rocky@gnu.org>
* README.rst, __pkginfo__.py: Get ready for release 2.3.0
2016-04-30 rocky <rocky@gnu.org>
* uncompyle6/parsers/astnode.py, uncompyle6/parsers/parse3.py,
uncompyle6/semantics/pysource.py: Python 3.0..3.2 bug in
LOAD_FAST/STORE_LOCALS:
    LOAD_FAST         '__locals__'
    STORE_LOCALS      ''
Also have to adjust doc constants for this crap. astnode.py: minor
format change
2016-04-30 rocky <rocky@gnu.org>
* test/Makefile, test/simple_source/def/06_classbug.py,
test/test_pythonlib.py: Test optimized Python code and Python 3.2
2016-04-30 rocky <rocky@gnu.org>
* uncompyle6/parsers/parse3.py: Previous commit's grammar change is for
Python 3.5 and up
2016-04-30 rocky <rocky@gnu.org>
* uncompyle6/parsers/parse3.py: Decompile Python 3.5 if statements.
Sometimes it doesn't need JUMP_FORWARD _come_from _come_from.
For example:
    def handle2(module):
        if module == 'foo':
            try:
                module = 1
            except ImportError as exc:
                module = exc
        return module
And:
    if __name__:
        for i in (1, 2):
            x = 3
2016-04-28 rocky <rocky@gnu.org>
* requirements.txt, uncompyle6/parser.py,
uncompyle6/parsers/parse2.py, uncompyle6/parsers/parse3.py,
uncompyle6/semantics/fragments.py, uncompyle6/semantics/pysource.py:
spark -> spark_parser
2016-04-28 rocky <rocky@gnu.org>
* uncompyle6/parsers/spark.py: Really remove spark - Use external
package instead
2016-04-27 R. Bernstein <rocky@users.noreply.github.com>
* : Merge pull request #8 from rocky/external-spark External spark
2016-04-27 rocky <rocky@gnu.org>
* .travis.yml, circle.yml: Note dependencies on spark
2016-04-27 rocky <rocky@gnu.org>
* .gitignore, README.rst, requirements.txt, uncompyle6/parser.py,
uncompyle6/parsers/astnode.py, uncompyle6/parsers/parse2.py,
uncompyle6/parsers/parse3.py, uncompyle6/semantics/fragments.py,
uncompyle6/semantics/pysource.py: Use external spark now.
2016-04-20 rocky <rocky@gnu.org>
* circle.yml: Back to 2.7.8
2016-04-20 rocky <rocky@gnu.org>
* circle.yml: Try python 2.7.10
2016-04-19 rocky <rocky@gnu.org>
* __pkginfo__.py: Get ready for release 2.2.0
* README.rst: Remove link to Mysterie uncompyle2 per request
2016-04-19 rocky <rocky@gnu.org>
* ChangeLog, NEWS, __pkginfo__.py: Get ready for release 2.2.0
2016-04-18 rocky <rocky@gnu.org>

NEWS

@@ -1,3 +1,9 @@
uncompyle6 2.3.0 2016-04-30
- Spark is no longer here; it was pulled out into a separate package, spark_parser
- Python 3 parsing fixes
- More tests
uncompyle6 2.2.0 2016-04-02
- Support single-mode (in addition to exec-mode) compilation

README.rst

@@ -1,4 +1,4 @@
|downloads| |buildstatus|
|buildstatus|
uncompyle6
==========
@@ -48,6 +48,8 @@ This uses setup.py, so it follows the standard Python routine:
::
pip install -r requirements.txt
pip install -r requirements-dev.txt
python setup.py install # may need sudo
# or if you have pyenv:
python setup.py develop
@@ -92,7 +94,6 @@ See Also
--------
* https://github.com/zrax/pycdc
* https://github.com/Mysterie/uncompyle2
* https://code.google.com/p/unpyc3/
The HISTORY file.

__pkginfo__.py

@@ -47,7 +47,7 @@ def get_srcdir():
return os.path.realpath(filename)
ns = {}
version = '2.2.0'
version = '2.3.0'
web = 'https://github.com/rocky/python-uncompyle6/'
# tracebacks in zip files are funky and not debuggable

circle.yml

@@ -6,7 +6,8 @@ machine:
dependencies:
override:
- pip install -r test-requirements.txt
- pip install -r requirements.txt
- pip install -r requirements-dev.txt
test:
override:
- python ./setup.py develop && make check-2.7

requirements.txt (new file)

@@ -0,0 +1 @@
spark_parser >= 1.1.0

test-requirements.txt (deleted)

@@ -1 +0,0 @@
pytest

test/Makefile

@@ -87,7 +87,11 @@ check-native-short:
check-2.7-ok:
$(PYTHON) test_pythonlib.py --ok-2.7 --verify $(COMPILE)
#: Run longer Python 2.7's lib files known to be okay
#: Run longer Python 3.2's lib files known to be okay
check-3.2-ok:
$(PYTHON) test_pythonlib.py --ok-3.2 --verify $(COMPILE)
#: Run longer Python 3.4's lib files known to be okay
check-3.4-ok:
$(PYTHON) test_pythonlib.py --ok-3.4 --verify $(COMPILE)

test/ok_lib2.7/_abcoll.pyc (new binary file; contents not shown)

test/ok_lib2.7/_abcoll.pyo (new binary file; contents not shown)

(five other changed binary files; contents not shown)

test/simple_source/def/06_classbug.py (new file)

@@ -0,0 +1,13 @@
# Python 3.2 Bug from abc.py
# Make sure we handle:
# LOAD_FAST '__locals__'
# STORE_LOCALS ''
class abstractclassmethod(object):
"""A Python 3.2 STORE_LOCALS bug
"""
def __init__(self, callable):
callable.__isabstractmethod__ = True

test/test_pythonlib.py

@@ -55,25 +55,28 @@ PYO = ('*.pyo', )
PYOC = ('*.pyc', '*.pyo')
test_options = {
# name: (src_basedir, pattern, output_base_suffix, pythoin_version)
# name: (src_basedir, pattern, output_base_suffix, python_version)
'test':
('test', PYC, 'test'),
'ok-2.6':
(os.path.join(src_dir, 'ok_2.6'),
PYC, 'ok-2.6', 2.6),
PYOC, 'ok-2.6', 2.6),
'ok-2.7': (os.path.join(src_dir, 'ok_lib2.7'),
PYC, 'ok-2.7', 2.7),
PYOC, 'ok-2.7', 2.7),
'ok-3.2': (os.path.join(src_dir, 'ok_lib3.2'),
PYOC, 'ok-3.2', 3.5),
'base-2.7': (os.path.join(src_dir, 'base_tests', 'python2.7'),
PYC, 'base_2.7', 2.7),
PYOC, 'base_2.7', 2.7),
}
for vers in (2.7, 3.4):
for vers in (2.7, 3.4, 3.5):
pythonlib = "ok_lib%s" % vers
key = "ok-%s" % vers
test_options[key] = (os.path.join(src_dir, pythonlib), PYC, key, vers)
test_options[key] = (os.path.join(src_dir, pythonlib), PYOC, key, vers)
pass
for vers in (2.5, 2.6, 2.7, 3.2, 3.3, 3.4, 3.5):
@@ -84,7 +87,7 @@ for vers in (2.5, 2.6, 2.7, 3.2, 3.3, 3.4, 3.5):
pythonlib = "python%s" % vers
if vers >= 3.0:
pythonlib = os.path.join(pythonlib, '__pycache__')
test_options[key] = (os.path.join(lib_prefix, pythonlib), PYC, pythonlib, vers)
test_options[key] = (os.path.join(lib_prefix, pythonlib), PYOC, pythonlib, vers)
#-----

uncompyle6/parser.py

@@ -3,7 +3,7 @@
# Copyright (c) 2000-2002 by hartmut Goebel <h.goebel@crazy-compilers.com>
# Copyright (c) 1999 John Aycock
"""
Common spark parser routines Python.
Common uncompyle parser routines.
"""
from __future__ import print_function
@@ -11,7 +11,7 @@ from __future__ import print_function
import sys
from uncompyle6.code import iscode
from uncompyle6.parsers.spark import GenericASTBuilder, DEFAULT_DEBUG as PARSER_DEFAULT_DEBUG
from spark_parser import GenericASTBuilder, DEFAULT_DEBUG as PARSER_DEFAULT_DEBUG
class ParserError(Exception):
def __init__(self, token, offset):
@@ -239,7 +239,7 @@ def python_parser(version, co, out=sys.stdout, showasm=False,
if __name__ == '__main__':
def parse_test(co):
from uncompyl6 import PYTHON_VERSION
from uncompyle6 import PYTHON_VERSION
ast = python_parser(PYTHON_VERSION, co, showasm=True)
print(ast)
return

uncompyle6/parsers/astnode.py

@@ -10,8 +10,8 @@ else:
class AST(UserList):
def __init__(self, type, kids=[]):
self.type = intern(type)
def __init__(self, kind, kids=[]):
self.type = intern(kind)
UserList.__init__(self, kids)
def isNone(self):
@@ -28,7 +28,8 @@ class AST(UserList):
else:
return self.type == o
def __hash__(self): return hash(self.type)
def __hash__(self):
return hash(self.type)
def __repr__(self, indent=''):
rv = str(self.type)

uncompyle6/parsers/parse2.py

@@ -19,12 +19,16 @@ from __future__ import print_function
from uncompyle6.parser import PythonParser, nop_func
from uncompyle6.parsers.astnode import AST
from uncompyle6.parsers.spark import GenericASTBuilder, DEFAULT_DEBUG as PARSER_DEFAULT_DEBUG
from spark_parser import GenericASTBuilder, DEFAULT_DEBUG as PARSER_DEFAULT_DEBUG
from uncompyle6 import PYTHON3
class Python2Parser(PythonParser):
def __init__(self, debug_parser=PARSER_DEFAULT_DEBUG):
GenericASTBuilder.__init__(self, AST, 'stmts', debug=debug_parser)
if PYTHON3:
super().__init__(AST, 'stmts', debug=debug_parser)
else:
super(Python2Parser, self).__init__(AST, 'stmts', debug=debug_parser)
self.customized = {}
def p_list_comprehension(self, args):

uncompyle6/parsers/parse3.py

@@ -19,13 +19,17 @@ from __future__ import print_function
from uncompyle6.parser import PythonParser, nop_func
from uncompyle6.parsers.astnode import AST
from uncompyle6.parsers.spark import GenericASTBuilder, DEFAULT_DEBUG as PARSER_DEFAULT_DEBUG
from spark_parser import DEFAULT_DEBUG as PARSER_DEFAULT_DEBUG
from uncompyle6 import PYTHON3
class Python3Parser(PythonParser):
def __init__(self, debug_parser=PARSER_DEFAULT_DEBUG):
self.added_rules = set()
GenericASTBuilder.__init__(self, AST, 'stmts', debug=debug_parser)
if PYTHON3:
super().__init__(AST, 'stmts', debug=debug_parser)
else:
super(Python3Parser, self).__init__(AST, 'stmts', debug=debug_parser)
self.new_rules = set()
def add_unique_rule(self, rule, opname, count, customize):
@@ -150,6 +154,9 @@ class Python3Parser(PythonParser):
designList ::= designator designator
designList ::= designator DUP_TOP designList
# FIXME: Store local is only used in Python 3.2
designator ::= STORE_LOCALS
designator ::= STORE_FAST
designator ::= STORE_NAME
designator ::= STORE_GLOBAL
@@ -271,6 +278,9 @@ class Python3Parser(PythonParser):
_ifstmts_jump ::= return_if_stmts
_ifstmts_jump ::= c_stmts_opt JUMP_FORWARD _come_from _come_from
# FIXME: this optimization is only used in Python 3.5 and beyond
_ifstmts_jump ::= c_stmts_opt
iflaststmt ::= testexpr c_stmts_opt JUMP_ABSOLUTE
iflaststmtl ::= testexpr c_stmts_opt JUMP_BACK

uncompyle6/parsers/spark.py (deleted)

@@ -1,737 +0,0 @@
"""
Copyright (c) 1998-2002 John Aycock
Copyright (c) 2015 Rocky Bernstein
Permission is hereby granted, free of charge, to any person obtaining
a copy of this software and associated documentation files (the
"Software"), to deal in the Software without restriction, including
without limitation the rights to use, copy, modify, merge, publish,
distribute, sublicense, and/or sell copies of the Software, and to
permit persons to whom the Software is furnished to do so, subject to
the following conditions:
The above copyright notice and this permission notice shall be
included in all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT.
IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY
CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT,
TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE
SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
"""
from __future__ import print_function
import os, re
__version__ = 'SPARK-1.0 Python3 compatible'
def _namelist(instance):
namelist, namedict, classlist = [], {}, [instance.__class__]
for c in classlist:
for b in c.__bases__:
classlist.append(b)
for name in list(c.__dict__.keys()):
if name not in namedict:
namelist.append(name)
namedict[name] = 1
return namelist
class _State:
'''
Extracted from GenericParser and made global so that [un]picking works.
'''
def __init__(self, stateno, items):
self.T, self.complete, self.items = [], [], items
self.stateno = stateno
# DEFAULT_DEBUG = {'rules': True, 'transition': True, 'reduce' : True}
# DEFAULT_DEBUG = {'rules': False, 'transition': False, 'reduce' : True}
DEFAULT_DEBUG = {'rules': False, 'transition': False, 'reduce': False}
class GenericParser:
'''
An Earley parser, as per J. Earley, "An Efficient Context-Free
Parsing Algorithm", CACM 13(2), pp. 94-102. Also J. C. Earley,
"An Efficient Context-Free Parsing Algorithm", Ph.D. thesis,
Carnegie-Mellon University, August 1968. New formulation of
the parser according to J. Aycock, "Practical Earley Parsing
and the SPARK Toolkit", Ph.D. thesis, University of Victoria,
2001, and J. Aycock and R. N. Horspool, "Practical Earley
Parsing", unpublished paper, 2001.
'''
def __init__(self, start, debug=DEFAULT_DEBUG):
self.rules = {}
self.rule2func = {}
self.rule2name = {}
self.collectRules()
self.augment(start)
self.ruleschanged = True
self.debug = debug
_NULLABLE = '\e_'
_START = 'START'
_BOF = '|-'
#
# When pickling, take the time to generate the full state machine;
# some information is then extraneous, too. Unfortunately we
# can't save the rule2func map.
#
def __getstate__(self):
if self.ruleschanged:
#
# XXX - duplicated from parse()
#
self.computeNull()
self.newrules = {}
self.new2old = {}
self.makeNewRules()
self.ruleschanged = False
self.edges, self.cores = {}, {}
self.states = { 0: self.makeState0() }
self.makeState(0, self._BOF)
#
# XXX - should find a better way to do this..
#
changes = 1
while changes:
changes = 0
for k, v in list(self.edges.items()):
if v is None:
state, sym = k
if state in self.states:
self.goto(state, sym)
changes = 1
rv = self.__dict__.copy()
for s in list(self.states.values()):
del s.items
del rv['rule2func']
del rv['nullable']
del rv['cores']
return rv
def __setstate__(self, D):
self.rules = {}
self.rule2func = {}
self.rule2name = {}
self.collectRules()
start = D['rules'][self._START][0][1][1] # Blech.
self.augment(start)
D['rule2func'] = self.rule2func
D['makeSet'] = self.makeSet_fast
self.__dict__ = D
#
# A hook for GenericASTBuilder and GenericASTMatcher. Mess
# thee not with this; nor shall thee toucheth the _preprocess
# argument to addRule.
#
def preprocess(self, rule, func):
return rule, func
def addRule(self, doc, func, _preprocess=True):
"""Add a grammar rules to self.rules, self.rule2func and self.rule2name"""
fn = func
# remove blanks lines and comment lines, e.g. lines starting with "#"
doc = os.linesep.join([s for s in doc.splitlines() if s and not re.match("^\s*#", s)])
rules = doc.split()
index = []
for i in range(len(rules)):
if rules[i] == '::=':
index.append(i-1)
index.append(len(rules))
for i in range(len(index)-1):
lhs = rules[index[i]]
rhs = rules[index[i]+2:index[i+1]]
rule = (lhs, tuple(rhs))
if _preprocess:
rule, fn = self.preprocess(rule, func)
if lhs in self.rules:
self.rules[lhs].append(rule)
else:
self.rules[lhs] = [ rule ]
self.rule2func[rule] = fn
self.rule2name[rule] = func.__name__[2:]
self.ruleschanged = True
def collectRules(self):
for name in _namelist(self):
if name[:2] == 'p_':
func = getattr(self, name)
doc = func.__doc__
self.addRule(doc, func)
def augment(self, start):
rule = '%s ::= %s %s' % (self._START, self._BOF, start)
self.addRule(rule, lambda args: args[1], 0)
def computeNull(self):
self.nullable = {}
tbd = []
for rulelist in list(self.rules.values()):
lhs = rulelist[0][0]
self.nullable[lhs] = 0
for rule in rulelist:
rhs = rule[1]
if len(rhs) == 0:
self.nullable[lhs] = 1
continue
#
# We only need to consider rules which
# consist entirely of nonterminal symbols.
# This should be a savings on typical
# grammars.
#
for sym in rhs:
if sym not in self.rules:
break
else:
tbd.append(rule)
changes = 1
while changes:
changes = 0
for lhs, rhs in tbd:
if self.nullable[lhs]:
continue
for sym in rhs:
if not self.nullable[sym]:
break
else:
self.nullable[lhs] = 1
changes = 1
def makeState0(self):
s0 = _State(0, [])
for rule in self.newrules[self._START]:
s0.items.append((rule, 0))
return s0
def finalState(self, tokens):
#
# Yuck.
#
if len(self.newrules[self._START]) == 2 and len(tokens) == 0:
return 1
start = self.rules[self._START][0][1][1]
return self.goto(1, start)
def makeNewRules(self):
worklist = []
for rulelist in list(self.rules.values()):
for rule in rulelist:
worklist.append((rule, 0, 1, rule))
for rule, i, candidate, oldrule in worklist:
lhs, rhs = rule
n = len(rhs)
while i < n:
sym = rhs[i]
if sym not in self.rules or \
not self.nullable[sym]:
candidate = 0
i = i + 1
continue
newrhs = list(rhs)
newrhs[i] = self._NULLABLE+sym
newrule = (lhs, tuple(newrhs))
worklist.append((newrule, i+1,
candidate, oldrule))
candidate = 0
i = i + 1
else:
if candidate:
lhs = self._NULLABLE+lhs
rule = (lhs, rhs)
if lhs in self.newrules:
self.newrules[lhs].append(rule)
else:
self.newrules[lhs] = [ rule ]
self.new2old[rule] = oldrule
def typestring(self, token):
return None
def error(self, token):
print("Syntax error at or near `%s' token" % token)
raise SystemExit
def parse(self, tokens):
sets = [ [(1, 0), (2, 0)] ]
self.links = {}
if self.ruleschanged:
self.computeNull()
self.newrules = {}
self.new2old = {}
self.makeNewRules()
self.ruleschanged = False
self.edges, self.cores = {}, {}
self.states = { 0: self.makeState0() }
self.makeState(0, self._BOF)
for i in range(len(tokens)):
sets.append([])
if sets[i] == []:
break
self.makeSet(tokens[i], sets, i)
else:
sets.append([])
self.makeSet(None, sets, len(tokens))
finalitem = (self.finalState(tokens), 0)
if finalitem not in sets[-2]:
if len(tokens) > 0:
self.error(tokens[i-1])
else:
self.error(None)
return self.buildTree(self._START, finalitem,
tokens, len(sets)-2)
def isnullable(self, sym):
#
# For symbols in G_e only. If we weren't supporting 1.5,
# could just use sym.startswith().
#
return self._NULLABLE == sym[0:len(self._NULLABLE)]
def skip(self, xxx_todo_changeme, pos=0):
(lhs, rhs) = xxx_todo_changeme
n = len(rhs)
while pos < n:
if not self.isnullable(rhs[pos]):
break
pos = pos + 1
return pos
def makeState(self, state, sym):
assert sym is not None
# print(sym) # debug
#
# Compute \epsilon-kernel state's core and see if
# it exists already.
#
kitems = []
for rule, pos in self.states[state].items:
lhs, rhs = rule
if rhs[pos:pos+1] == (sym,):
kitems.append((rule, self.skip(rule, pos+1)))
tcore = tuple(sorted(kitems))
if tcore in self.cores:
return self.cores[tcore]
#
# Nope, doesn't exist. Compute it and the associated
# \epsilon-nonkernel state together; we'll need it right away.
#
k = self.cores[tcore] = len(self.states)
K, NK = _State(k, kitems), _State(k+1, [])
self.states[k] = K
predicted = {}
edges = self.edges
rules = self.newrules
for X in K, NK:
worklist = X.items
for item in worklist:
rule, pos = item
lhs, rhs = rule
if pos == len(rhs):
X.complete.append(rule)
continue
nextSym = rhs[pos]
key = (X.stateno, nextSym)
if nextSym not in rules:
if key not in edges:
edges[key] = None
X.T.append(nextSym)
else:
edges[key] = None
if nextSym not in predicted:
predicted[nextSym] = 1
for prule in rules[nextSym]:
ppos = self.skip(prule)
new = (prule, ppos)
NK.items.append(new)
#
# Problem: we know K needs generating, but we
# don't yet know about NK. Can't commit anything
# regarding NK to self.edges until we're sure. Should
# we delay committing on both K and NK to avoid this
# hacky code? This creates other problems..
#
if X is K:
edges = {}
if NK.items == []:
return k
#
# Check for \epsilon-nonkernel's core. Unfortunately we
# need to know the entire set of predicted nonterminals
# to do this without accidentally duplicating states.
#
tcore = tuple(sorted(predicted.keys()))
if tcore in self.cores:
self.edges[(k, None)] = self.cores[tcore]
return k
nk = self.cores[tcore] = self.edges[(k, None)] = NK.stateno
self.edges.update(edges)
self.states[nk] = NK
return k
def goto(self, state, sym):
key = (state, sym)
if key not in self.edges:
#
# No transitions from state on sym.
#
return None
rv = self.edges[key]
if rv is None:
#
# Target state isn't generated yet. Remedy this.
#
rv = self.makeState(state, sym)
self.edges[key] = rv
return rv
def gotoT(self, state, t):
if self.debug['rules']: print("Terminal", t, state)
return [self.goto(state, t)]
def gotoST(self, state, st):
if self.debug['transition']: print("GotoST", st, state)
rv = []
for t in self.states[state].T:
if st == t:
rv.append(self.goto(state, t))
return rv
def add(self, set, item, i=None, predecessor=None, causal=None):
if predecessor is None:
if item not in set:
set.append(item)
else:
key = (item, i)
if item not in set:
self.links[key] = []
set.append(item)
self.links[key].append((predecessor, causal))
def makeSet(self, token, sets, i):
cur, next = sets[i], sets[i+1]
ttype = token is not None and self.typestring(token) or None
if ttype is not None:
fn, arg = self.gotoT, ttype
else:
fn, arg = self.gotoST, token
for item in cur:
ptr = (item, i)
state, parent = item
add = fn(state, arg)
for k in add:
if k is not None:
self.add(next, (k, parent), i+1, ptr)
nk = self.goto(k, None)
if nk is not None:
self.add(next, (nk, i+1))
if parent == i:
continue
for rule in self.states[state].complete:
lhs, rhs = rule
if self.debug['reduce']:
print("%s ::= %s" % (lhs, ' '.join(rhs)))
for pitem in sets[parent]:
pstate, pparent = pitem
k = self.goto(pstate, lhs)
if k is not None:
why = (item, i, rule)
pptr = (pitem, parent)
self.add(cur, (k, pparent),
i, pptr, why)
nk = self.goto(k, None)
if nk is not None:
self.add(cur, (nk, i))
def makeSet_fast(self, token, sets, i):
#
# Call *only* when the entire state machine has been built!
# It relies on self.edges being filled in completely, and
# then duplicates and inlines code to boost speed at the
# cost of extreme ugliness.
#
cur, next = sets[i], sets[i+1]
ttype = token is not None and self.typestring(token) or None
for item in cur:
ptr = (item, i)
state, parent = item
if ttype is not None:
k = self.edges.get((state, ttype), None)
if k is not None:
# self.add(next, (k, parent), i+1, ptr)
# INLINED --------v
new = (k, parent)
key = (new, i+1)
if new not in next:
self.links[key] = []
next.append(new)
self.links[key].append((ptr, None))
# INLINED --------^
# nk = self.goto(k, None)
nk = self.edges.get((k, None), None)
if nk is not None:
# self.add(next, (nk, i+1))
# INLINED -------------v
new = (nk, i+1)
if new not in next:
next.append(new)
# INLINED ---------------^
else:
add = self.gotoST(state, token)
for k in add:
if k is not None:
self.add(next, (k, parent), i+1, ptr)
# nk = self.goto(k, None)
nk = self.edges.get((k, None), None)
if nk is not None:
self.add(next, (nk, i+1))
if parent == i:
continue
for rule in self.states[state].complete:
lhs, rhs = rule
for pitem in sets[parent]:
pstate, pparent = pitem
# k = self.goto(pstate, lhs)
k = self.edges.get((pstate, lhs), None)
if k is not None:
why = (item, i, rule)
pptr = (pitem, parent)
# self.add(cur, (k, pparent), i, pptr, why)
# INLINED ---------v
new = (k, pparent)
key = (new, i)
if new not in cur:
self.links[key] = []
cur.append(new)
self.links[key].append((pptr, why))
# INLINED ----------^
# nk = self.goto(k, None)
nk = self.edges.get((k, None), None)
if nk is not None:
# self.add(cur, (nk, i))
# INLINED ---------v
new = (nk, i)
if new not in cur:
cur.append(new)
# INLINED ----------^
def predecessor(self, key, causal):
for p, c in self.links[key]:
if c == causal:
return p
assert 0
def causal(self, key):
links = self.links[key]
if len(links) == 1:
return links[0][1]
choices = []
rule2cause = {}
for p, c in links:
rule = c[2]
choices.append(rule)
rule2cause[rule] = c
return rule2cause[self.ambiguity(choices)]
def deriveEpsilon(self, nt):
if len(self.newrules[nt]) > 1:
rule = self.ambiguity(self.newrules[nt])
else:
rule = self.newrules[nt][0]
# print(rule) # debug
rhs = rule[1]
attr = [None] * len(rhs)
for i in range(len(rhs)-1, -1, -1):
attr[i] = self.deriveEpsilon(rhs[i])
return self.rule2func[self.new2old[rule]](attr)
def buildTree(self, nt, item, tokens, k):
if self.debug['rules']:
print("NT", nt)
state, parent = item
choices = []
for rule in self.states[state].complete:
if rule[0] == nt:
choices.append(rule)
rule = choices[0]
if len(choices) > 1:
rule = self.ambiguity(choices)
# print(rule) # debug
rhs = rule[1]
attr = [None] * len(rhs)
for i in range(len(rhs)-1, -1, -1):
sym = rhs[i]
if sym not in self.newrules:
if sym != self._BOF:
attr[i] = tokens[k-1]
key = (item, k)
item, k = self.predecessor(key, None)
# elif self.isnullable(sym):
elif self._NULLABLE == sym[0:len(self._NULLABLE)]:
attr[i] = self.deriveEpsilon(sym)
else:
key = (item, k)
why = self.causal(key)
attr[i] = self.buildTree(sym, why[0],
tokens, why[1])
item, k = self.predecessor(key, why)
return self.rule2func[self.new2old[rule]](attr)
def ambiguity(self, rules):
#
# XXX - problem here and in collectRules() if the same rule
# appears in >1 method. Also undefined results if rules
# causing the ambiguity appear in the same method.
#
sortlist = []
name2index = {}
for i in range(len(rules)):
lhs, rhs = rule = rules[i]
name = self.rule2name[self.new2old[rule]]
sortlist.append((len(rhs), name))
name2index[name] = i
sortlist.sort()
list = [a_b[1] for a_b in sortlist]
return rules[name2index[self.resolve(list)]]
def resolve(self, list):
'''
Resolve ambiguity in favor of the shortest RHS.
Since we walk the tree from the top down, this
should effectively resolve in favor of a "shift".
'''
return list[0]
#
# GenericASTBuilder automagically constructs a concrete/abstract syntax tree
# for a given input. The extra argument is a class (not an instance!)
# which supports the "__setslice__" and "__len__" methods.
#
# XXX - silently overrides any user code in methods.
#
class GenericASTBuilder(GenericParser):
def __init__(self, AST, start, debug=DEFAULT_DEBUG):
GenericParser.__init__(self, start, debug=debug)
self.AST = AST
def preprocess(self, rule, func):
rebind = lambda lhs, self=self: \
lambda args, lhs=lhs, self=self: \
self.buildASTNode(args, lhs)
lhs, rhs = rule
return rule, rebind(lhs)
def buildASTNode(self, args, lhs):
children = []
for arg in args:
if isinstance(arg, self.AST):
children.append(arg)
else:
children.append(self.terminal(arg))
return self.nonterminal(lhs, children)
def terminal(self, token):
return token
def nonterminal(self, type, args):
rv = self.AST(type)
rv[:len(args)] = args
return rv
class GenericASTTraversalPruningException(BaseException):
pass
class GenericASTTraversal:
'''
GenericASTTraversal is a Visitor pattern according to Design Patterns. For
each node it attempts to invoke the method n_<node type>, falling
back onto the default() method if the n_* can't be found. The preorder
traversal also looks for an exit hook named n_<node type>_exit (no default
routine is called if it's not found). To prematurely halt traversal
of a subtree, call the prune() method -- this only makes sense for a
preorder traversal. Node type is determined via the typestring() method.
'''
def __init__(self, ast):
self.ast = ast
def typestring(self, node):
return node.type
def prune(self):
raise GenericASTTraversalPruningException
def preorder(self, node=None):
"""Walk the tree. For each node with typestring name *name* if the
node has a method called n_*name*, call that before walking
children. If there is no method define, call a
self.default(node) instead. Subclasses of GenericASTTtraversal
ill probably want to override this method.
If the node has a method called *name*_exit, that is called
after all children have been called. So in this sense this
function is both preorder and postorder combined.
"""
if node is None:
node = self.ast
try:
name = 'n_' + self.typestring(node)
if hasattr(self, name):
func = getattr(self, name)
func(node)
else:
self.default(node)
except GenericASTTraversalPruningException:
return
for kid in node:
self.preorder(kid)
name = name + '_exit'
if hasattr(self, name):
func = getattr(self, name)
func(node)
def default(self, node):
"""Default acttion to take on an ASTNode. Our defualt is to do nothing.
Subclasses will probably want to define this for other behavior."""
pass

uncompyle6/semantics/fragments.py

@@ -46,7 +46,7 @@ else:
from StringIO import StringIO
from uncompyle6.parsers.spark import GenericASTTraversal, GenericASTTraversalPruningException, \
from spark_parser import GenericASTTraversal, GenericASTTraversalPruningException, \
DEFAULT_DEBUG as PARSER_DEFAULT_DEBUG
from types import CodeType

uncompyle6/semantics/pysource.py

@@ -70,7 +70,7 @@ from uncompyle6 import PYTHON3
from uncompyle6.code import iscode
from uncompyle6.parser import get_python_parser
from uncompyle6.parsers.astnode import AST
from uncompyle6.parsers.spark import GenericASTTraversal, DEFAULT_DEBUG as PARSER_DEFAULT_DEBUG
from spark_parser import GenericASTTraversal, DEFAULT_DEBUG as PARSER_DEFAULT_DEBUG
from uncompyle6.scanner import Code, get_scanner
from uncompyle6.scanners.tok import Token, NoneToken
import uncompyle6.parser as python_parser
@@ -93,7 +93,6 @@ RETURN_LOCALS = AST('return_stmt',
[ AST('ret_expr', [AST('expr', [ Token('LOAD_LOCALS') ])]),
Token('RETURN_VALUE')])
NONE = AST('expr', [ NoneToken ] )
RETURN_NONE = AST('stmt',
@@ -1224,6 +1223,14 @@ class SourceWalker(GenericASTTraversal, object):
n_unpack_w_parens = n_unpack
def n_assign(self, node):
# A horrible hack for Python 3.0 .. 3.2
if 3.0 <= self.version <= 3.2 and len(node) == 2:
if (node[0][0] == 'LOAD_FAST' and node[0][0].pattr == '__locals__' and
node[1][0].type == 'STORE_LOCALS'):
self.prune()
self.default(node)
def n_assign2(self, node):
for n in node[-2:]:
if n[0] == 'unpack':
@@ -1538,12 +1545,20 @@ class SourceWalker(GenericASTTraversal, object):
pass
# if docstring exists, dump it
if (code.co_consts and code.co_consts[0] is not None
and len(ast) > 0 and ast[0][0] == ASSIGN_DOC_STRING(code.co_consts[0])):
if self.hide_internal:
if (code.co_consts and code.co_consts[0] is not None and len(ast) > 0):
do_doc = False
if (ast[0][0] == ASSIGN_DOC_STRING(code.co_consts[0])):
i = 0
do_doc = True
elif (len(ast) > 2 and 3.0 <= self.version <= 3.2 and
ast[2][0] == ASSIGN_DOC_STRING(code.co_consts[0])):
i = 2
do_doc = True
if do_doc and self.hide_internal:
self.print_docstring(indent, code.co_consts[0])
self.print_()
del ast[0]
del ast[i]
# the function defining a class normally returns locals(); we
# don't want this to show up in the source, thus remove the node