Compare commits

...

40 Commits

Author SHA1 Message Date
rocky
1e3ea60055 Get ready for release 2.9.10 2017-02-25 20:35:00 -05:00
rocky
2fbbc728b1 Python 2.6 parsing bugs ..
and some parser list nonterminal cleanup
2017-02-25 04:45:10 -05:00
rocky
0a6c8ba909 Python 2.6 control flow bug with added COME_FROM 2017-02-24 21:29:28 -05:00
rocky
d3904527e6 Python 2.5 wasn't handling tryelse properly 2017-02-22 05:38:30 -05:00
rocky
b043f6bafc New test doesn't --verify correctly. Sigh. 2017-02-20 09:22:01 -05:00
rocky
aa207a3c77 Add test for last while1 bug fix 2017-02-20 09:15:39 -05:00
rocky
747212c62c Python 3.x needs more "while 1" grammar rules 2017-02-20 08:57:16 -05:00
rocky
493e4b14a1 Some Python 3.4 bugs fixed by using 3.5 rules 2017-02-20 08:17:17 -05:00
rocky
9491c67779 More COME_FROM's in Python 3...
Need this to find boundaries of simple if better
2017-02-20 04:17:46 -05:00
rocky
8ef5e5d12b Marginally better for Python 2.6 but...
control flow is still wrong.
2017-02-19 08:12:15 -05:00
rocky
222986640e Merge branch 'coverage'
Beef up coverage
2017-02-10 02:09:28 -05:00
rocky
f9d47abb2b Reduce withas and with semantic footprint
This appears in python 2.5+ only. 2.5 is via "from future"
2017-02-10 02:08:52 -05:00
rocky
31ed869a6f Beef up grammar coverage 2017-02-10 02:03:28 -05:00
rocky
19d2569515 Changes based on grammar coverage info 2017-01-29 23:01:12 -05:00
R. Bernstein
9348411056 Merge pull request #83 from rocky/coverage
Coverage
2017-01-29 21:54:45 -05:00
rocky
e71dd010d7 Simplify getting coverage
consts.py: notes on which versions use which ops
2017-01-29 21:39:29 -05:00
rocky
dadd1c5c45 Add --coverage to test_pyenvlib and ...
improve grammar coverage on 2.7
2017-01-29 18:06:07 -05:00
rocky
99af1c9ffe Merge branch 'master' into coverage 2017-01-29 07:35:02 -05:00
rocky
3dc766d0a9 Update date 2017-01-29 07:34:49 -05:00
rocky
357005c814 Add --coverage option. WOOT! 2017-01-29 07:33:41 -05:00
rocky
41d63a0261 Bump min spark_parser version 2017-01-27 16:41:31 -05:00
rocky
1cb2cd7a82 More 2.6, 2.7 control flow
Todo more COME_FROMs but now need to check targets better. In some cases
we're relying on grammar ambiguity to work out right and in 2.7 it doesn't
2017-01-24 01:21:28 -05:00
rocky
9ec312ba5e More 2.6, 2.7 control-flow bugs
Wasn't limiting exception clause to try finally. Probably still has bugs
in try-finally nesting

Add another 2.6/2.7 COME_FROM to try to limit if/end scope better
2017-01-24 00:53:30 -05:00
rocky
597d51951e Improve Python 2.6 & 2.7 verification 2017-01-23 02:32:09 -05:00
rocky
cc2321f49e Fix up Python 3.0 handling 2017-01-22 03:45:40 -05:00
rocky
476a1c8ab5 Merge branch 'master' of github.com:rocky/python-uncompyle6 2017-01-21 06:25:54 -05:00
rocky
545a46dffa Correct spelling of Earley 2017-01-21 06:24:31 -05:00
rocky
8333e4ae93 Handle BUILD_CONST_KEY_MAP as a varargs
custom rules with BUILD_CONST_KEY_MAP are pinned to the specific number
of args seen.
2017-01-20 20:41:10 -05:00
R. Bernstein
e9057f378a Merge pull request #81 from moagstar/BUILD_CONST_KEY_MAP
fixed bug with BUILD_CONST_KEY_MAP
2017-01-19 20:43:10 -05:00
Daniel Bradburn
36b75abd90 fixed bug with BUILD_CONST_KEY_MAP 2017-01-19 21:58:56 +01:00
R. Bernstein
1528537ca4 Merge pull request #80 from moagstar/BUILD_CONST_KEY_MAP
Build const key map
2017-01-19 01:24:18 -05:00
Daniel Bradburn
6b8ae29267 added dev requirement six 2017-01-18 22:43:33 +01:00
Daniel Bradburn
33ec66a82f added generation of dict display from BUILD_CONST_KEY_MAP 2017-01-18 22:38:09 +01:00
Daniel Bradburn
b0493d1984 fixed typo 2017-01-18 22:34:12 +01:00
Daniel Bradburn
7f37c60c42 added some more test cases for BUILD_CONST_KEY_MAP 2017-01-18 22:33:44 +01:00
Daniel Bradburn
e2fd308928 simplified test cases for test_build_const_key_map 2017-01-17 23:07:27 +01:00
Daniel Bradburn
6d7cec002a added validation code for checking decompilation of an expression 2017-01-17 22:40:31 +01:00
rocky
9c49b5d54b Handle 3.6 BUILD_CONST_KEY_MAP 2017-01-15 11:10:13 -05:00
rocky
8dc23e2cdc Python 2.1 doesn't have FOR_ITER or GET_ITER...
adjust logic for this fact
2017-01-15 09:50:38 -05:00
rocky
a01b8be054 sys.recursionlimit is optional, not essential 2017-01-12 04:48:39 -05:00
35 changed files with 648 additions and 98 deletions

ChangeLog

@@ -1,6 +1,186 @@
2017-02-25 rocky <rb@dustyfeet.com>
* uncompyle6/version.py: Get ready for release 2.9.10
2017-02-25 rocky <rb@dustyfeet.com>
* uncompyle6/parser.py, uncompyle6/parsers/parse26.py: Python 2.6
parsing bugs .. and some parser list nonterminal cleanup
2017-02-24 rocky <rb@dustyfeet.com>
* test/simple_source/bug25/03_if_for.py,
uncompyle6/parsers/parse26.py: Python 2.6 control flow bug with
added COME_FROM
2017-02-22 rocky <rb@dustyfeet.com>
* test/simple_source/bug25/02_try_else.py,
uncompyle6/parsers/parse25.py: Python 2.5 wasn't handling tryelse
properly
2017-02-20 rocky <rb@dustyfeet.com>
* : New test doesn't --verify correctly. Sigh.
2017-02-20 rocky <rb@dustyfeet.com>
* test/simple_source/bug33/02_while1.py: Add test for last while1
bug fix
2017-02-20 rocky <rb@dustyfeet.com>
* uncompyle6/parsers/parse3.py, uncompyle6/parsers/parse35.py:
Python 3.x needs more "while 1" grammar rules
2017-02-20 rocky <rb@dustyfeet.com>
* uncompyle6/parsers/parse3.py, uncompyle6/parsers/parse35.py,
uncompyle6/scanners/scanner3.py: Some Python 3.4 bugs fixed by
using 3.5 rules
2017-02-20 rocky <rb@dustyfeet.com>
* test/simple_source/exception/02_try_finally.py,
uncompyle6/parsers/parse3.py, uncompyle6/scanners/scanner3.py: More
COME_FROM's in Python 3... Need this to find boundaries of simple if better
2017-02-19 rocky <rb@dustyfeet.com>
* uncompyle6/parsers/parse26.py: Marginally better for Python 2.6
but... control flow is still wrong.
2017-02-10 rocky <rb@dustyfeet.com>
* : commit f9d47abb2be7c3839df06c0ed69d3d513694af4e Author: rocky
<rb@dustyfeet.com> Date: Fri Feb 10 02:07:04 2017 -0500
2017-02-10 rocky <rb@dustyfeet.com>
* test/simple_source/bug22/01_ops.py, test/test_pythonlib.py: Beef
up grammar coverage
2017-01-29 rocky <rb@dustyfeet.com>
* test/Makefile, test/simple_source/bug22/01_ops.py,
uncompyle6/parsers/parse25.py, uncompyle6/semantics/consts.py,
uncompyle6/semantics/pysource.py: Changes based on grammar coverage
info
2017-01-29 R. Bernstein <rocky@users.noreply.github.com>
* : Merge pull request #83 from rocky/coverage Coverage
2017-01-29 rocky <rb@dustyfeet.com>
* test/simple_source/bug22/01_ops.py, test/test_pyenvlib.py: Add
--coverage to test_pyenvlib and ... improve grammar coverage on 2.7
2017-01-29 rocky <rb@dustyfeet.com>
* : commit 3dc766d0a9537842470c7b4f79e8ccb3d5a46843 Author: rocky
<rb@dustyfeet.com> Date: Sun Jan 29 07:34:49 2017 -0500
2017-01-29 rocky <rb@dustyfeet.com>
* test/test_pythonlib.py: Add --coverage option. WOOT!
2017-01-27 rocky <rb@dustyfeet.com>
* __pkginfo__.py: Bump min spark_parser version
2017-01-24 rocky <rb@dustyfeet.com>
* uncompyle6/scanners/scanner2.py, uncompyle6/semantics/consts.py,
uncompyle6/semantics/pysource.py: More 2.6, 2.7 control flow Todo more COME_FROMs but now need to check targets better. In some
cases we're relying on grammar ambiguity to work out right and in
2.7 it doesn't
2017-01-24 rocky <rb@dustyfeet.com>
* uncompyle6/parsers/parse2.py, uncompyle6/parsers/parse27.py,
uncompyle6/scanners/scanner2.py, uncompyle6/semantics/consts.py,
uncompyle6/semantics/pysource.py: More 2.6, 2.7 control-flow bugs Wasn't limiting exception clause to try finally. Probably still has
bugs in try-finally nesting Add another 2.6/2.7 COME_FROM to try to limit if/end scope better
2017-01-23 rocky <rb@dustyfeet.com>
* uncompyle6/scanners/scanner2.py,
uncompyle6/scanners/scanner26.py, uncompyle6/verify.py: Improve
Python 2.6 & 2.7 verification
2017-01-22 rocky <rb@dustyfeet.com>
* uncompyle6/parsers/parse30.py, uncompyle6/verify.py: Fix up Python
3.0 handling
2017-01-21 rocky <rb@dustyfeet.com>
* : commit 545a46dffaa0fe2246dd7cc1b560c58db525c2b4 Author: rocky
<rb@dustyfeet.com> Date: Sat Jan 21 06:24:31 2017 -0500
2017-01-20 rocky <rb@dustyfeet.com>
* uncompyle6/parsers/parse3.py, uncompyle6/scanners/scanner3.py,
uncompyle6/semantics/pysource.py: Handle BUILD_CONST_KEY_MAP as a
varargs custom rules with BUILD_CONST_KEY_MAP are pinned to the specific
number of args seen.
2017-01-19 R. Bernstein <rocky@users.noreply.github.com>
* : Merge pull request #81 from moagstar/BUILD_CONST_KEY_MAP fixed bug with BUILD_CONST_KEY_MAP
2017-01-19 R. Bernstein <rocky@users.noreply.github.com>
* : Merge pull request #80 from moagstar/BUILD_CONST_KEY_MAP Build const key map
2017-01-18 Daniel Bradburn <moagstar@gmail.com>
* uncompyle6/semantics/pysource.py: added generation of dict display
from BUILD_CONST_KEY_MAP
2017-01-18 Daniel Bradburn <moagstar@gmail.com>
* pytest/test_build_const_key_map.py: fixed typo
2017-01-18 Daniel Bradburn <moagstar@gmail.com>
* pytest/test_build_const_key_map.py: added some more test cases for
BUILD_CONST_KEY_MAP
2017-01-17 Daniel Bradburn <moagstar@gmail.com>
* pytest/test_build_const_key_map.py: simplified test cases for
test_build_const_key_map
2017-01-17 Daniel Bradburn <moagstar@gmail.com>
* pytest/test_build_const_key_map.py, pytest/validate.py: added
validation code for checking decompilation of an expression
2017-01-15 rocky <rb@dustyfeet.com>
* uncompyle6/parsers/parse3.py, uncompyle6/scanners/scanner3.py:
Handle 3.6 BUILD_CONST_KEY_MAP
2017-01-15 rocky <rb@dustyfeet.com>
* uncompyle6/scanners/scanner2.py: Python 2.1 doesn't have FOR_ITER
or GET_ITER... adjust logic for this fact
2017-01-12 rocky <rb@dustyfeet.com>
* uncompyle6/__init__.py: sys.recursionlimit is optional, not
essential
2017-01-11 rocky <rb@dustyfeet.com>
* uncompyle6/version.py: Get ready for release 2.10.9
* : commit b131c20e99514d3a969a51e841d3a823017f1beb Author: rocky
<rb@dustyfeet.com> Date: Wed Jan 11 21:32:26 2017 -0500
2017-01-11 rocky <rb@dustyfeet.com>
* ChangeLog, NEWS: Get ready for release 2.10.9
2017-01-11 R. Bernstein <rocky@users.noreply.github.com>


@@ -30,7 +30,7 @@ another clever idea: using table-driven semantics routines, using
format specifiers.
The last mention of a release of SPARK from John is around 2002. As
released, although the Early Algorithm parser was in good shape, this
released, although the Earley Algorithm parser was in good shape, this
code was woefully lacking as serious Python deparser.
In the fall of 2000, Hartmut Goebel
@@ -135,9 +135,9 @@ Hartmut a decade an a half ago:
NB. This is not a masterpiece of software, but became more like a hack.
Probably a complete rewrite would be sensefull. hG/2000-12-27
This project deparses using an Early-algorithm parse with lots of
This project deparses using an Earley-algorithm parse with lots of
massaging of tokens and the grammar in the scanner
phase. Early-algorithm parsers are context free and tend to be linear
phase. Earley-algorithm parsers are context free and tend to be linear
if the grammar is LR or left recursive.
Another approach that doesn't use grammars is to do something like

NEWS

@@ -1,3 +1,9 @@
uncompyle6 2.9.10 2017-02-25
- Python grammar rule fixes
- Add ability to get grammar coverage on runs
- Handle Python 3.6 opcode BUILD_CONST_KEY_MAP
uncompyle6 2.9.9 2016-12-16
- Remaining Python 3.5 ops handled


@@ -9,7 +9,7 @@
# Things that change more often go here.
copyright = """
Copyright (C) 2015, 2016 Rocky Bernstein <rb@dustyfeet.com>.
Copyright (C) 2015-2017 Rocky Bernstein <rb@dustyfeet.com>.
"""
classifiers = ['Development Status :: 5 - Production/Stable',
@@ -39,7 +39,7 @@ entry_points={
'pydisassemble=uncompyle6.bin.pydisassemble:main',
]}
ftp_url = None
install_requires = ['spark-parser >= 1.5.1, < 1.6.0',
install_requires = ['spark-parser >= 1.6.0, < 1.7.0',
'xdis >= 3.2.4, < 3.3.0']
license = 'MIT'
mailing_list = 'python-debugger@googlegroups.com'


@@ -0,0 +1,21 @@
import pytest
# uncompyle6
from uncompyle6 import PYTHON_VERSION
from validate import validate_uncompyle
@pytest.mark.skipif(PYTHON_VERSION < 3.6, reason='need at least python 3.6')
@pytest.mark.parametrize('text', (
"{0.: 'a', -1: 'b'}", # BUILD_MAP
"{'a':'b'}", # BUILD_MAP
"{0: 1}", # BUILD_MAP
"{b'0':1, b'2':3}", # BUILD_CONST_KEY_MAP
"{0: 1, 2: 3}", # BUILD_CONST_KEY_MAP
"{'a':'b','c':'d'}", # BUILD_CONST_KEY_MAP
"{0: 1, 2: 3}", # BUILD_CONST_KEY_MAP
"{'a': 1, 'b': 2}", # BUILD_CONST_KEY_MAP
"{'a':'b','c':'d'}", # BUILD_CONST_KEY_MAP
"{0.0:'b',0.1:'d'}", # BUILD_CONST_KEY_MAP
))
def test_build_const_key_map(text):
validate_uncompyle(text)
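
The comments in the parametrize list above record which opcode the CPython 3.6 compiler is expected to pick for each dict display. A minimal way to check that choice yourself with the standard dis module (Python 3.6+; map_opcode is a hypothetical helper, not part of the test suite):

import dis

def map_opcode(src):
    """Return the dict-building opcode CPython chose for the expression."""
    ops = set(inst.opname for inst in dis.Bytecode(compile(src, '<string>', 'eval')))
    return 'BUILD_CONST_KEY_MAP' if 'BUILD_CONST_KEY_MAP' in ops else 'BUILD_MAP'

print(map_opcode("{'a': 1, 'b': 2}"))    # expected: BUILD_CONST_KEY_MAP on 3.6
print(map_opcode("{0.: 'a', -1: 'b'}"))  # expected: BUILD_MAP (-1 is not a plain constant here)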

pytest/validate.py (new file)

@@ -0,0 +1,143 @@
# future
from __future__ import print_function
# std
import os
import dis
import difflib
import subprocess
import tempfile
# compatibility
import six
# uncompyle6 / xdis
from uncompyle6 import PYTHON_VERSION, deparse_code
def _dis_to_text(co):
return dis.Bytecode(co).dis()
def print_diff(original, uncompyled):
"""
Try and display a pretty html line difference between the original and
uncompyled code and bytecode if elinks and BeautifulSoup are installed
otherwise just show the diff.
:param original: Text describing the original code object.
:param uncompyled: Text describing the uncompyled code object.
"""
original_lines = original.split('\n')
uncompyled_lines = uncompyled.split('\n')
args = original_lines, uncompyled_lines, 'original', 'uncompyled'
try:
from bs4 import BeautifulSoup
diff = difflib.HtmlDiff().make_file(*args)
diff = BeautifulSoup(diff, "html.parser")
diff.select_one('table[summary="Legends"]').extract()
except ImportError:
print('\nTo display diff highlighting run:\n pip install BeautifulSoup4')
diff = difflib.HtmlDiff().make_table(*args)
with tempfile.NamedTemporaryFile(delete=False) as f:
f.write(str(diff).encode('utf-8'))
try:
print()
html = subprocess.check_output([
'elinks',
'-dump',
'-no-references',
'-dump-color-mode',
'1',
f.name,
]).decode('utf-8')
print(html)
except:
print('\nFor side by side diff install elinks')
diff = difflib.Differ().compare(original_lines, uncompyled_lines)
print('\n'.join(diff))
finally:
os.unlink(f.name)
def are_instructions_equal(i1, i2):
"""
Determine if two instructions are approximately equal,
ignoring certain fields which we allow to differ, namely:
* code objects are ignored (should probably be checked) due to address
* line numbers
:param i1: left instruction to compare
:param i2: right instruction to compare
:return: True if the two instructions are approximately equal, otherwise False.
"""
result = (1==1
and i1.opname == i2.opname
and i1.opcode == i2.opcode
and i1.arg == i2.arg
# ignore differences due to code objects
# TODO : Better way of ignoring address
and (i1.argval == i2.argval or '<code object' in str(i1.argval))
# TODO : Should probably recurse to check code objects
and (i1.argrepr == i2.argrepr or '<code object' in i1.argrepr)
and i1.offset == i2.offset
# ignore differences in line numbers
#and i1.starts_line
and i1.is_jump_target == i2.is_jump_target
)
return result
def are_code_objects_equal(co1, co2):
"""
Determine if two code objects are approximately equal,
see are_instructions_equal for more information.
:param i1: left code object to compare
:param i2: right code object to compare
:return: True if the two code objects are approximately equal, otherwise False.
"""
# TODO : Use xdis for python2 compatibility
instructions1 = dis.Bytecode(co1)
instructions2 = dis.Bytecode(co2)
for opcode1, opcode2 in zip(instructions1, instructions2):
if not are_instructions_equal(opcode1, opcode2):
return False
return True
def validate_uncompyle(text, mode='exec'):
"""
Validate decompilation of the given source code.
:param text: Source to validate decompilation of.
"""
original_code = compile(text, '<string>', mode)
original_dis = _dis_to_text(original_code)
original_text = text
deparsed = deparse_code(PYTHON_VERSION, original_code,
compile_mode=mode, out=six.StringIO())
uncompyled_text = deparsed.text
uncompyled_code = compile(uncompyled_text, '<string>', 'exec')
if not are_code_objects_equal(uncompyled_code, original_code):
uncompyled_dis = _dis_to_text(uncompyled_text)
def output(text, dis):
width = 60
return '\n\n'.join([
' SOURCE CODE '.center(width, '#'),
text.strip(),
' BYTECODE '.center(width, '#'),
dis
])
original = output(original_text, original_dis)
uncompyled = output(uncompyled_text, uncompyled_dis)
print_diff(original, uncompyled)
assert 'original' == 'uncompyled'
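
validate_uncompyle above is what the parametrized tests call: it compiles the text, deparses the code object with deparse_code, recompiles the result, and compares the two code objects instruction by instruction, printing a diff and failing the final assert on a mismatch. A usage sketch outside of pytest, assuming it is run from the pytest/ directory on a Python version uncompyle6 supports:

# One-off round-trip check; run from the pytest/ directory so the local
# validate module is importable.  Raises AssertionError (after printing a
# diff) when the recompiled decompilation does not match the original.
from validate import validate_uncompyle

validate_uncompyle("{'a': 1, 'b': 2}")        # expression statement, default mode='exec'
validate_uncompyle("x = [i * i for i in y]")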


@@ -1,3 +1,4 @@
pytest
flake8
hypothesis
six


@@ -98,6 +98,21 @@ check-bytecode-2.4:
check-bytecode-2.5:
$(PYTHON) test_pythonlib.py --bytecode-2.5
#: Get grammar coverage for Python 2.5
grammar-coverage-2.5:
SPARK_PARSER_COVERAGE=/tmp/spark-grammar-25.cover $(PYTHON) test_pythonlib.py --bytecode-2.5
SPARK_PARSER_COVERAGE=/tmp/spark-grammar-25.cover $(PYTHON) test_pyenvlib.py --2.5.6
#: Get grammar coverage for Python 2.6
grammar-coverage-2.6:
SPARK_PARSER_COVERAGE=/tmp/spark-grammar-26.cover $(PYTHON) test_pythonlib.py --bytecode-2.6
SPARK_PARSER_COVERAGE=/tmp/spark-grammar-26.cover $(PYTHON) test_pyenvlib.py --2.6.9
#: Get grammar coverage for Python 2.7
grammar-coverage-2.7:
SPARK_PARSER_COVERAGE=/tmp/spark-grammar-27.cover $(PYTHON) test_pythonlib.py --bytecode-2.7
SPARK_PARSER_COVERAGE=/tmp/spark-grammar-27.cover $(PYTHON) test_pyenvlib.py --2.7.13
#: Check deparsing Python 2.6
check-bytecode-2.6:
$(PYTHON) test_pythonlib.py --bytecode-2.6 --weak-verify
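
Each of these targets points the SPARK_PARSER_COVERAGE environment variable at a per-version file before running a test driver, so spark_parser can accumulate grammar-rule usage there. The same run can be scripted in Python, mirroring what test_pythonlib.py does further below (the path is only an example):

import os
import subprocess
import sys

# One half of the grammar-coverage-2.7 target above; run from the test/ directory.
# spark_parser is expected to record grammar-rule usage in the named file.
os.environ['SPARK_PARSER_COVERAGE'] = '/tmp/spark-grammar-27.cover'   # example path
subprocess.check_call([sys.executable, 'test_pythonlib.py', '--bytecode-2.7'])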

4 binary files changed but not shown.


@@ -0,0 +1,18 @@
# Statements to beef up grammar coverage rules
# Force "inplace" ops
y = +10 # UNARY_POSITIVE
y /= 1 # INPLACE_DIVIDE
y %= 4 # INPLACE_MODULO
y **= 1 # INPLACE POWER
y >>= 2 # INPLACE_RSHIFT
y <<= 2 # INPLACE_LSHIFT
y //= 1 # INPLACE_TRUE_DIVIDE
y &= 1 # INPLACE_AND
y ^= 1 # INPLACE_XOR
`y` # UNARY_CONVERT - Not in Python 3.x
# Beef up augassign and STORE_SLICE+3
x = [1,2,3,4,5]
x[0:1] = 1
x[0:3] += 1, 2, 3


@@ -1,7 +1,6 @@
# Python 2.5 bug
# Was turning into tryelse when there in fact is no else.
def options(self, section):
"""Return a list of option names for the given section name."""
try:
opts = self._sections[section].copy()
except KeyError:
@@ -10,3 +9,19 @@ def options(self, section):
if '__name__' in opts:
del opts['__name__']
return opts.keys()
# From python2.5/distutils/command/register.py
def post_to_server(self, urllib2):
try:
result = 5
except urllib2.HTTPError, e:
result = e.code, e.msg
except urllib2.URLError, e:
result = 500
else:
if self.show_response:
result = 10
result = 200
if self.show_response:
result = 8
return result


@@ -0,0 +1,7 @@
# From Python 2.6. distutils/sysconfig.py
def get_config_vars(_config_vars, args):
if _config_vars:
if args == 1:
if args < 8:
for key in ('LDFLAGS', 'BASECFLAGS'):
_config_vars[key] = 4


@@ -0,0 +1,14 @@
# From Python 3.4 mailcap
def readmailcapfile(caps):
while 1:
line = 'abc'
if line[0] == '#' or line == '':
continue
key, fields = (1,2)
if not (key and fields):
continue
if key in caps:
caps[key].append(fields)
else:
caps[key] = [fields]
return caps


@@ -0,0 +1,11 @@
# From 2.6.9 cmd.py
try:
if __file__:
x = 2
x = 3
finally:
if x and __file__:
try:
x = 1
except:
pass


@@ -30,7 +30,7 @@ from fnmatch import fnmatch
TEST_VERSIONS=('2.3.7', '2.4.6', '2.5.6', '2.6.9',
'pypy-2.4.0', 'pypy-2.6.1',
'pypy-5.0.1', 'pypy-5.3.1',
'2.7.10', '2.7.11', '2.7.12',
'2.7.10', '2.7.11', '2.7.12', '2.7.13',
'3.0.1', '3.1.5', '3.2.6',
'3.3.5', '3.3.6',
'3.4.2', '3.5.1', '3.6.0')
@@ -106,28 +106,40 @@ def do_tests(src_dir, patterns, target_dir, start_with=None, do_verify=False):
if __name__ == '__main__':
import getopt, sys
do_verify = False
do_coverage = do_verify = False
test_dirs = []
start_with = None
test_options_keys = list(test_options.keys())
test_options_keys.sort()
opts, args = getopt.getopt(sys.argv[1:], '',
['start-with=', 'verify', 'weak-verify', 'all', ] \
['start-with=', 'verify', 'weak-verify',
'coverage', 'all', ] \
+ test_options_keys )
vers = ''
for opt, val in opts:
if opt == '--verify':
do_verify = True
if opt == '--weak-verify':
do_verify = 'weak'
if opt == '--coverage':
do_coverage = True
elif opt == '--start-with':
start_with = val
elif opt[2:] in test_options_keys:
test_dirs.append(test_options[opt[2:]])
triple = test_options[opt[2:]]
vers = triple[-1]
test_dirs.append(triple)
elif opt == '--all':
vers = 'all'
for val in test_options_keys:
test_dirs.append(test_options[val])
if do_coverage:
os.environ['SPARK_PARSER_COVERAGE'] = (
'/tmp/spark-grammar-%s.cover' % vers
)
for src_dir, pattern, target_dir in test_dirs:
if os.path.exists(src_dir):
target_dir = os.path.join(target_base, target_dir)


@@ -190,6 +190,7 @@ if __name__ == '__main__':
test_options_keys.sort()
opts, args = getopt.getopt(sys.argv[1:], '',
['start-with=', 'verify', 'weak-verify', 'all', 'compile',
'coverage',
'no-rm'] \
+ test_options_keys )
if not opts: help()
@@ -198,7 +199,8 @@ if __name__ == '__main__':
'do_compile': False,
'do_verify': False,
'start_with': None,
'rmtree' : True
'rmtree' : True,
'coverage' : False
}
for opt, val in opts:
@@ -217,11 +219,18 @@ if __name__ == '__main__':
elif opt == '--all':
for val in test_options_keys:
test_dirs.append(test_options[val])
elif opt == '--coverage':
test_opts['coverage'] = True
else:
help()
pass
pass
if test_opts['coverage']:
os.environ['SPARK_PARSER_COVERAGE'] = (
'/tmp/spark-grammar-python-lib%s.cover' % test_dirs[0][-1]
)
last_compile_version = None
for src_dir, pattern, target_dir, compiled_version in test_dirs:
if os.path.isdir(src_dir):


@@ -41,7 +41,9 @@ PYTHON_VERSION_STR = "%s.%s" % (sys.version_info[0], sys.version_info[1])
IS_PYPY = '__pypy__' in sys.builtin_module_names
sys.setrecursionlimit(5000)
if hasattr(sys, 'setrecursionlimit'):
# pyston doesn't have setrecursionlimit
sys.setrecursionlimit(5000)
import uncompyle6.semantics.pysource
import uncompyle6.semantics.fragments


@@ -1,4 +1,4 @@
# Copyright (c) 2015-2016 Rocky Bernstein
# Copyright (c) 2015-2017 Rocky Bernstein
# Copyright (c) 2005 by Dan Pascu <dan@windowmaker.org>
# Copyright (c) 2000-2002 by hartmut Goebel <h.goebel@crazy-compilers.com>
# Copyright (c) 1999 John Aycock
@@ -28,6 +28,16 @@ nop_func = lambda self, args: None
class PythonParser(GenericASTBuilder):
def __init__(self, AST, start, debug):
super(PythonParser, self).__init__(AST, start, debug)
self.collect = frozenset(
['stmts', 'except_stmts', '_stmts',
'exprlist', 'kvlist', 'kwargs', 'come_froms',
# Python < 3
'print_items',
# PyPy:
'kvlist_n'])
def add_unique_rule(self, rule, opname, count, customize):
"""Add rule to grammar, but only if it hasn't been added previously
opname and count are used in the customize() semantic the actions
@@ -115,11 +125,7 @@ class PythonParser(GenericASTBuilder):
return token.type
def nonterminal(self, nt, args):
collect = ('stmts', 'exprlist', 'kvlist', '_stmts', 'print_items', 'kwargs',
# PYPY:
'kvlist_n')
if nt in collect and len(args) > 1:
if nt in self.collect and len(args) > 1:
#
# Collect iterated thingies together. That is rather than
# stmts -> stmts stmt -> stmts stmt -> ...


@@ -126,6 +126,7 @@ class Python2Parser(PythonParser):
assert_expr_and ::= assert_expr jmp_false expr
ifstmt ::= testexpr _ifstmts_jump
ifstmt ::= testexpr return_if_stmts COME_FROM
testexpr ::= testfalse
testexpr ::= testtrue
@@ -144,6 +145,8 @@ class Python2Parser(PythonParser):
ifelsestmtr ::= testexpr return_if_stmts return_stmts
ifelsestmtr ::= testexpr return_if_stmts COME_FROM return_stmts
ifelsestmtl ::= testexpr c_stmts_opt JUMP_BACK else_suitel


@@ -1,4 +1,4 @@
# Copyright (c) 2016 Rocky Bernstein
# Copyright (c) 2016-2017 Rocky Bernstein
"""
spark grammar differences over Python2.6 for Python 2.5.
"""
@@ -20,13 +20,18 @@ class Python25Parser(Python26Parser):
return_if_stmt ::= ret_expr RETURN_END_IF JUMP_BACK
# Python 2.6 uses ROT_TWO instead of the STORE_xxx
# withas is allowed as a "from future" in 2.5
setupwithas ::= DUP_TOP LOAD_ATTR store LOAD_ATTR CALL_FUNCTION_0
setup_finally
store ::= STORE_FAST
store ::= STORE_NAME
# Python 2.6 omits ths LOAD_FAST DELETE_FAST below
tryelsestmt ::= SETUP_EXCEPT suite_stmts_opt POP_BLOCK
try_middle else_suite COME_FROM
# Python 2.6 omits the LOAD_FAST DELETE_FAST below
# withas is allowed as a "from future" in 2.5
withasstmt ::= expr setupwithas designator suite_stmts_opt
POP_BLOCK LOAD_CONST COME_FROM
with_cleanup
@@ -46,18 +51,6 @@ class Python25Parser(Python26Parser):
tokens, first, last)
if invalid:
return invalid
lhs = rule[0]
if lhs in ('tryelsestmt', ):
# The end of the else part of try/else come_from has to come
# from an END_FINALLY statement
if tokens[last-1].type.startswith('COME_FROM'):
end_finally_offset = int(tokens[last-1].pattr)
current = first
while current < last:
offset = tokens[current].offset
if offset == end_finally_offset:
return tokens[current].type != 'END_FINALLY'
current += 1
return False


@@ -1,4 +1,4 @@
# Copyright (c) 2016 Rocky Bernstein
# Copyright (c) 2017 Rocky Bernstein
"""
spark grammar differences over Python2 for Python 2.6.
"""
@@ -22,25 +22,22 @@ class Python26Parser(Python2Parser):
JUMP_IF_FALSE POP_TOP POP_TOP designator POP_TOP
try_middle ::= JUMP_FORWARD COME_FROM except_stmts
come_from_pop END_FINALLY COME_FROM
come_from_pop END_FINALLY come_froms
try_middle ::= JUMP_FORWARD COME_FROM except_stmts END_FINALLY
come_froms
try_middle ::= jmp_abs COME_FROM except_stmts
POP_TOP END_FINALLY
try_middle ::= jmp_abs COME_FROM except_stmts
POP_TOP END_FINALLY
trystmt ::= SETUP_EXCEPT suite_stmts_opt POP_TOP
try_middle
END_FINALLY JUMP_FORWARD
# Sometimes we don't put in COME_FROM to the next statement
# like we do in 2.7. Perhaps we should?
trystmt ::= SETUP_EXCEPT suite_stmts_opt POP_BLOCK
try_middle
trystmt ::= SETUP_EXCEPT suite_stmts_opt POP_BLOCK
try_middle come_froms
tryelsestmt ::= SETUP_EXCEPT suite_stmts_opt POP_BLOCK
try_middle else_suite COME_FROM
@@ -151,6 +148,8 @@ class Python26Parser(Python2Parser):
iflaststmtl ::= testexpr c_stmts_opt JUMP_BACK come_from_pop
iflaststmt ::= testexpr c_stmts_opt JUMP_ABSOLUTE come_from_pop
lastc_stmt ::= iflaststmt COME_FROM
while1stmt ::= SETUP_LOOP l_stmts_opt JUMP_BACK COME_FROM
ifstmt ::= testexpr_then _ifstmts_jump


@@ -61,7 +61,7 @@ class Python27Parser(Python2Parser):
ret_and ::= expr JUMP_IF_FALSE_OR_POP ret_expr_or_cond COME_FROM
ret_or ::= expr JUMP_IF_TRUE_OR_POP ret_expr_or_cond COME_FROM
ret_cond ::= expr POP_JUMP_IF_FALSE expr RETURN_END_IF ret_expr_or_cond
ret_cond ::= expr POP_JUMP_IF_FALSE expr RETURN_END_IF COME_FROM ret_expr_or_cond
ret_cond_not ::= expr POP_JUMP_IF_TRUE expr RETURN_END_IF ret_expr_or_cond
or ::= expr JUMP_IF_TRUE_OR_POP expr COME_FROM


@@ -44,6 +44,10 @@ class Python3Parser(PythonParser):
list_for ::= expr FOR_ITER designator list_iter jb_or_c
# This is seen in PyPy, but possibly it appears on other Python 3?
list_if ::= expr jmp_false list_iter COME_FROM
list_if_not ::= expr jmp_true list_iter COME_FROM
jb_or_c ::= JUMP_BACK
jb_or_c ::= CONTINUE
@@ -52,6 +56,9 @@ class Python3Parser(PythonParser):
setcomp_func ::= BUILD_SET_0 LOAD_FAST FOR_ITER designator comp_iter
JUMP_BACK RETURN_VALUE RETURN_LAST
setcomp_func ::= BUILD_SET_0 LOAD_FAST FOR_ITER designator comp_iter
COME_FROM JUMP_BACK RETURN_VALUE RETURN_LAST
comp_body ::= dict_comp_body
comp_body ::= set_comp_body
dict_comp_body ::= expr expr MAP_ADD
@@ -113,9 +120,11 @@ class Python3Parser(PythonParser):
classdefdeco1 ::= expr classdefdeco1 CALL_FUNCTION_1
classdefdeco1 ::= expr classdefdeco2 CALL_FUNCTION_1
assert ::= assert_expr jmp_true LOAD_ASSERT RAISE_VARARGS_1
assert2 ::= assert_expr jmp_true LOAD_ASSERT expr CALL_FUNCTION_1 RAISE_VARARGS_1
assert2 ::= assert_expr jmp_true LOAD_ASSERT expr RAISE_VARARGS_2
assert ::= assert_expr jmp_true LOAD_ASSERT RAISE_VARARGS_1 COME_FROM
assert2 ::= assert_expr jmp_true LOAD_ASSERT expr CALL_FUNCTION_1
RAISE_VARARGS_1 COME_FROM
assert2 ::= assert_expr jmp_true LOAD_ASSERT expr
RAISE_VARARGS_2 COME_FROM
assert_expr ::= expr
assert_expr ::= assert_expr_or
@@ -132,6 +141,7 @@ class Python3Parser(PythonParser):
_ifstmts_jump ::= return_if_stmts
_ifstmts_jump ::= c_stmts_opt JUMP_FORWARD COME_FROM
_ifstmts_jump ::= c_stmts_opt COME_FROM
iflaststmt ::= testexpr c_stmts_opt JUMP_ABSOLUTE
@@ -155,6 +165,7 @@ class Python3Parser(PythonParser):
ifelsestmtr ::= testexpr return_if_stmts return_stmts
ifelsestmtl ::= testexpr c_stmts_opt JUMP_BACK else_suitel
ifelsestmtl ::= testexpr c_stmts_opt COME_FROM JUMP_BACK else_suitel
# FIXME: this feels like a hack. Is it just 1 or two
# COME_FROMs? the parsed tree for this and even with just the
@@ -329,6 +340,9 @@ class Python3Parser(PythonParser):
forelselaststmtl ::= SETUP_LOOP expr _for designator for_block POP_BLOCK else_suitel
COME_FROM_LOOP
whilestmt ::= SETUP_LOOP testexpr l_stmts_opt COME_FROM JUMP_BACK POP_BLOCK
COME_FROM_LOOP
whilestmt ::= SETUP_LOOP testexpr l_stmts_opt JUMP_BACK POP_BLOCK
COME_FROM_LOOP
@@ -356,6 +370,9 @@ class Python3Parser(PythonParser):
while1stmt ::= SETUP_LOOP l_stmts
while1stmt ::= SETUP_LOOP l_stmts COME_FROM_LOOP
while1stmt ::= SETUP_LOOP l_stmts COME_FROM JUMP_BACK COME_FROM_LOOP
# FIXME: investigate - can code really produce a NOP?
whileTruestmt ::= SETUP_LOOP l_stmts_opt JUMP_BACK NOP
COME_FROM_LOOP
@@ -611,11 +628,17 @@ class Python3Parser(PythonParser):
rule = kvlist_n + ' ::= ' + 'expr ' * (token.attr*2)
self.add_unique_rule(rule, opname, token.attr, customize)
rule = "mapexpr ::= %s %s" % (kvlist_n, opname)
else:
rule = kvlist_n + ' ::= ' + 'expr expr STORE_MAP ' * token.attr
self.add_unique_rule(rule, opname, token.attr, customize)
rule = "mapexpr ::= %s %s" % (opname, kvlist_n)
self.add_unique_rule(rule, opname, token.attr, customize)
elif opname_base == 'BUILD_CONST_KEY_MAP':
# This is in 3.6+
kvlist_n = 'expr ' * (token.attr)
rule = "mapexpr ::= %sLOAD_CONST %s" % (kvlist_n, opname)
self.add_unique_rule(rule, opname, token.attr, customize)
elif opname_base in ('UNPACK_EX',):
before_count, after_count = token.attr
rule = 'unpack ::= ' + opname + ' designator' * (before_count + after_count + 1)
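
In the BUILD_CONST_KEY_MAP branch above, token.attr is the number of values and the keys arrive as a single LOAD_CONST tuple, so each generated rule is pinned to the exact argument count seen. A small sketch of the rule string that branch builds (the count of 3 and the _3-suffixed opname are assumptions, following the scanner's convention of appending the argument count to variable-arg opcodes):

# Hypothetical token: three values with keys in one LOAD_CONST tuple,
# e.g. from {'a': 1, 'b': 2, 'c': 3}; the scanner is assumed to have
# renamed the op BUILD_CONST_KEY_MAP_3.
opname, token_attr = 'BUILD_CONST_KEY_MAP_3', 3
kvlist_n = 'expr ' * token_attr
print("mapexpr ::= %sLOAD_CONST %s" % (kvlist_n, opname))
# -> mapexpr ::= expr expr expr LOAD_CONST BUILD_CONST_KEY_MAP_3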


@@ -5,9 +5,9 @@ spark grammar differences over Python 3.1 for Python 3.0.
from __future__ import print_function
from uncompyle6.parser import PythonParserSingle
from uncompyle6.parsers.parse3 import Python3Parser
from uncompyle6.parsers.parse31 import Python31Parser
class Python30Parser(Python3Parser):
class Python30Parser(Python31Parser):
def p_30(self, args):
"""
@@ -44,5 +44,10 @@ class Python30Parser(Python3Parser):
setup_finally ::= STORE_FAST SETUP_FINALLY LOAD_FAST DELETE_FAST
"""
def add_custom_rules(self, tokens, customize):
super(Python30Parser, self).add_custom_rules(tokens, customize)
return
pass
class Python30ParserSingle(Python30Parser, PythonParserSingle):
pass


@@ -20,6 +20,9 @@ class Python35Parser(Python34Parser):
# I'm sure by the time Python 4 comes around these will be turned
# into special opcodes
while1stmt ::= SETUP_LOOP l_stmts COME_FROM JUMP_BACK
POP_BLOCK COME_FROM_LOOP
# Python 3.5+ Await statement
stmt ::= await_stmt
await_stmt ::= call_function GET_AWAITABLE LOAD_CONST YIELD_FROM POP_TOP
@@ -110,7 +113,6 @@ class Python35Parser(Python34Parser):
yield_from ::= expr GET_YIELD_FROM_ITER LOAD_CONST YIELD_FROM
_ifstmts_jump ::= c_stmts_opt COME_FROM
"""
def add_custom_rules(self, tokens, customize):


@@ -159,7 +159,6 @@ class Scanner2(scan.Scanner):
# we sort them). That way, specific COME_FROM tags will match up
# properly. For example, a "loop" with an "if" nested in it should have the
# "loop" tag last so the grammar rule matches that properly.
# last_offset = -1
for jump_offset in sorted(jump_targets[offset], reverse=True):
# if jump_offset == last_offset:
# continue
@@ -265,13 +264,14 @@ class Scanner2(scan.Scanner):
# rule for that.
target = self.get_target(offset)
if target <= offset:
op_name = 'JUMP_BACK'
if (offset in self.stmts
and self.code[offset+3] not in (self.opc.END_FINALLY,
self.opc.POP_BLOCK)
and offset not in self.not_continue):
self.opc.POP_BLOCK)):
if ((offset in self.linestartoffsets and
self.code[self.prev[offset]] == self.opc.JUMP_ABSOLUTE)
or offset not in self.not_continue):
op_name = 'CONTINUE'
else:
op_name = 'JUMP_BACK'
elif op == self.opc.LOAD_GLOBAL:
if offset in self.load_asserts:
@@ -427,7 +427,7 @@ class Scanner2(scan.Scanner):
j = self.prev[s]
while code[j] in self.designator_ops:
j = self.prev[j]
if self.version >= 2.1 and code[j] == self.opc.FOR_ITER:
if self.version > 2.1 and code[j] == self.opc.FOR_ITER:
stmts.remove(s)
continue
last_stmt = s
@@ -589,7 +589,7 @@ class Scanner2(scan.Scanner):
target = self.get_target(jump_back, self.opc.JUMP_ABSOLUTE)
if (self.version >= 2.0 and
if (self.version > 2.1 and
code[target] in (self.opc.FOR_ITER, self.opc.GET_ITER)):
loop_type = 'for'
else:
@@ -617,7 +617,7 @@ class Scanner2(scan.Scanner):
'start': jump_back+3,
'end': end})
elif op == self.opc.SETUP_EXCEPT:
start = offset+3
start = offset + self.op_size(op)
target = self.get_target(offset, op)
end = self.restrict_to_parent(target, parent)
if target != end:
@@ -630,9 +630,23 @@ class Scanner2(scan.Scanner):
# Now isolate the except and else blocks
end_else = start_else = self.get_target(self.prev[end])
end_finally_offset = end
setup_except_nest = 0
while end_finally_offset < len(self.code):
if self.code[end_finally_offset] == self.opc.END_FINALLY:
if setup_except_nest == 0:
break
else:
setup_except_nest -= 1
elif self.code[end_finally_offset] == self.opc.SETUP_EXCEPT:
setup_except_nest += 1
end_finally_offset += self.op_size(code[end_finally_offset])
pass
# Add the except blocks
i = end
while i < len(self.code) and self.code[i] != self.opc.END_FINALLY:
while i < len(self.code) and i < end_finally_offset:
jmp = self.next_except_jump(i)
if jmp is None: # check
i = self.next_stmt[i]
@@ -924,6 +938,7 @@ class Scanner2(scan.Scanner):
'end': rtarget})
self.thens[start] = rtarget
if self.version == 2.7 or code[pre_rtarget+1] != self.opc.JUMP_FORWARD:
self.fixed_jumps[offset] = rtarget
self.return_end_ifs.add(pre_rtarget)
elif op in self.pop_jump_if_or_pop:


@@ -158,12 +158,15 @@ class Scanner26(scan.Scanner2):
# we sort them). That way, specific COME_FROM tags will match up
# properly. For example, a "loop" with an "if" nested in it should have the
# "loop" tag last so the grammar rule matches that properly.
last_jump_offset = -1
for jump_offset in sorted(jump_targets[offset], reverse=True):
if jump_offset != last_jump_offset:
tokens.append(Token(
'COME_FROM', None, repr(jump_offset),
offset="%s_%d" % (offset, jump_idx),
has_arg = True))
jump_idx += 1
last_jump_offset = jump_offset
elif offset in self.thens:
tokens.append(Token(
'THEN', None, self.thens[offset],
@@ -242,16 +245,21 @@ class Scanner26(scan.Scanner2):
# boundaries The continue-type jumps help us get
# "continue" statements with would otherwise be turned
# into a "pass" statement because JUMPs are sometimes
# ignored in rules as just boundary overhead.
# ignored in rules as just boundary overhead. In
# comprehensions we might sometimes classify JUMP_BACK
# as CONTINUE, but that's okay since we add a grammar
# rule for that.
target = self.get_target(offset)
if target <= offset:
op_name = 'JUMP_BACK'
if (offset in self.stmts
and self.code[offset+3] not in (self.opc.END_FINALLY,
self.opc.POP_BLOCK)
and offset not in self.not_continue):
self.opc.POP_BLOCK)):
if ((offset in self.linestartoffsets and
tokens[-1].type == 'JUMP_BACK')
or offset not in self.not_continue):
op_name = 'CONTINUE'
else:
op_name = 'JUMP_BACK'
# FIXME: this is a hack to catch stuff like:
# if x: continue
# the "continue" is not on a new line.


@@ -126,7 +126,12 @@ class Scanner3(Scanner):
if is_pypy:
varargs_ops.add(self.opc.CALL_METHOD)
if self.version >= 3.6:
varargs_ops.add(self.opc.BUILD_CONST_KEY_MAP)
self.varargs_ops = frozenset(varargs_ops)
# FIXME: remove the above in favor of:
# self.varargs_ops = frozenset(self.opc.hasvargs)
def opName(self, offset):
return self.opc.opname[self.code[offset]]
@@ -326,7 +331,7 @@ class Scanner3(Scanner):
(next_opname not in ('END_FINALLY', 'POP_BLOCK',
# Python 3.0 only uses POP_TOP
'POP_TOP'))):
if (self.version >= 3.5 or
if (self.version >= 3.4 or
(inst.offset not in self.not_continue) or
(tokens[-1].type == 'RETURN_VALUE')):
opname = 'CONTINUE'
@@ -888,7 +893,10 @@ class Scanner3(Scanner):
else:
self.fixed_jumps[offset] = rtarget
self.not_continue.add(pre_rtarget)
else:
# For now, we'll only tag forward jump.
if rtarget > offset:
self.fixed_jumps[offset] = rtarget
elif op == self.opc.SETUP_EXCEPT:
target = self.get_target(offset)


@@ -73,7 +73,7 @@ TABLE_DIRECT = {
'BINARY_MULTIPLY': ( '*' ,),
'BINARY_DIVIDE': ( '/' ,),
'BINARY_MATRIX_MULTIPLY': ( '@' ,),
'BINARY_TRUE_DIVIDE': ( '/' ,),
'BINARY_TRUE_DIVIDE': ( '/' ,), # Not in <= 2.1
'BINARY_FLOOR_DIVIDE': ( '//' ,),
'BINARY_MODULO': ( '%%',),
'BINARY_POWER': ( '**',),
@@ -87,7 +87,7 @@ TABLE_DIRECT = {
'INPLACE_MULTIPLY': ( '*=' ,),
'INPLACE_MATRIX_MULTIPLY': ( '@=' ,),
'INPLACE_DIVIDE': ( '/=' ,),
'INPLACE_TRUE_DIVIDE': ( '/=' ,),
'INPLACE_TRUE_DIVIDE': ( '/=' ,), # Not in <= 2.1; 2.6 generates INPLACE_DIVIDE only?
'INPLACE_FLOOR_DIVIDE': ( '//=' ,),
'INPLACE_MODULO': ( '%%=',),
'INPLACE_POWER': ( '**=',),
@@ -223,7 +223,9 @@ TABLE_DIRECT = {
'elifstmt': ( '%|elif %c:\n%+%c%-', 0, 1 ),
'elifelsestmt': ( '%|elif %c:\n%+%c%-%|else:\n%+%c%-', 0, 1, 3 ),
'ifelsestmtr': ( '%|if %c:\n%+%c%-%|else:\n%+%c%-', 0, 1, 2 ),
'ifelsestmtr2': ( '%|if %c:\n%+%c%-%|else:\n%+%c%-\n\n', 0, 1, 3 ), # has COME_FROM
'elifelsestmtr': ( '%|elif %c:\n%+%c%-%|else:\n%+%c%-\n\n', 0, 1, 2 ),
'elifelsestmtr2': ( '%|elif %c:\n%+%c%-%|else:\n%+%c%-\n\n', 0, 1, 3 ), # has COME_FROM
'whileTruestmt': ( '%|while True:\n%+%c%-\n\n', 1 ),
'whilestmt': ( '%|while %c:\n%+%c%-\n\n', 1, 2 ),
@@ -249,8 +251,6 @@ TABLE_DIRECT = {
'except_cond1': ( '%|except %c:\n', 1 ),
'except_suite': ( '%+%c%-%C', 0, (1, maxint, '') ),
'except_suite_finalize': ( '%+%c%-%C', 1, (3, maxint, '') ),
'withstmt': ( '%|with %c:\n%+%c%-', 0, 3),
'withasstmt': ( '%|with %c as %c:\n%+%c%-', 0, 2, 3),
'passstmt': ( '%|pass\n', ),
'STORE_FAST': ( '%{pattr}', ),
'kv': ( '%c: %c', 3, 1 ),
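
The TABLE_DIRECT entries above map nonterminals to templates for uncompyle6's table-driven source generation: each tuple is a format string followed by child indexes, where the pysource.py engine expands %c as the child node at the next index, %| as the current indentation, %+ and %- as indent increase/decrease, and %C as a range of children joined by a separator. A sketch of how the 'withasstmt' entry (removed here and re-added for Python 2.5 in pysource.py below) would render:

# Illustrative expansion of the 'withasstmt' template
# ('%|with %c as %c:\n%+%c%-', 0, 2, 3); the strings below stand in for the
# children that %c would traverse -- this is not the real engine.
indent = '    '
ctx, name, body = 'open(path)', 'fp', 'use(fp)'
print('%swith %s as %s:' % (indent, ctx, name))
print('%s%s' % (indent * 2, body))   # the %+ ... %- pair nests the body one level deeper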


@@ -1,4 +1,4 @@
# Copyright (c) 2015, 2016 by Rocky Bernstein
# Copyright (c) 2015-2017 by Rocky Bernstein
# Copyright (c) 2005 by Dan Pascu <dan@windowmaker.org>
# Copyright (c) 2000-2002 by hartmut Goebel <h.goebel@crazy-compilers.com>
# Copyright (c) 1999 John Aycock
@@ -241,6 +241,9 @@ class SourceWalker(GenericASTTraversal, object):
TABLE_DIRECT.update({
'importmultiple': ( '%|import %c%c\n', 2, 3 ),
'import_cont' : ( ', %c', 2 ),
# With/as is allowed as "from future" thing in 2.5
'withstmt': ( '%|with %c:\n%+%c%-', 0, 3),
'withasstmt': ( '%|with %c as %c:\n%+%c%-', 0, 2, 3),
})
########################################
@@ -723,11 +726,18 @@ class SourceWalker(GenericASTTraversal, object):
n_ifelsestmtc = n_ifelsestmtl = n_ifelsestmt
def n_ifelsestmtr(self, node):
if len(node[2]) != 2:
if node[2] == 'COME_FROM':
return_stmts_node = node[3]
node.type = 'ifelsestmtr2'
else:
return_stmts_node = node[2]
if len(return_stmts_node) != 2:
self.default(node)
if not (node[2][0][0][0] == 'ifstmt' and node[2][0][0][0][1][0] == 'return_if_stmts') \
and not (node[2][0][-1][0] == 'ifstmt' and node[2][0][-1][0][1][0] == 'return_if_stmts'):
if (not (return_stmts_node[0][0][0] == 'ifstmt'
and return_stmts_node[0][0][0][1][0] == 'return_if_stmts')
and not (return_stmts_node[0][-1][0] == 'ifstmt'
and return_stmts_node[0][-1][0][1][0] == 'return_if_stmts')):
self.default(node)
return
@@ -739,13 +749,14 @@ class SourceWalker(GenericASTTraversal, object):
self.indentLess()
if_ret_at_end = False
if len(node[2][0]) >= 3:
if node[2][0][-1][0] == 'ifstmt' and node[2][0][-1][0][1][0] == 'return_if_stmts':
if len(return_stmts_node[0]) >= 3:
if (return_stmts_node[0][-1][0] == 'ifstmt'
and return_stmts_node[0][-1][0][1][0] == 'return_if_stmts'):
if_ret_at_end = True
past_else = False
prev_stmt_is_if_ret = True
for n in node[2][0]:
for n in return_stmts_node[0]:
if (n[0] == 'ifstmt' and n[0][1][0] == 'return_if_stmts'):
if prev_stmt_is_if_ret:
n[0].type = 'elifstmt'
@@ -760,15 +771,22 @@ class SourceWalker(GenericASTTraversal, object):
if not past_else or if_ret_at_end:
self.println(self.indent, 'else:')
self.indentMore()
self.preorder(node[2][1])
self.preorder(return_stmts_node[1])
self.indentLess()
self.prune()
n_ifelsestmtr2 = n_ifelsestmtr
def n_elifelsestmtr(self, node):
if len(node[2]) != 2:
if node[2] == 'COME_FROM':
return_stmts_node = node[3]
node.type = 'elifelsestmtr2'
else:
return_stmts_node = node[2]
if len(return_stmts_node) != 2:
self.default(node)
for n in node[2][0]:
for n in return_stmts_node[0]:
if not (n[0] == 'ifstmt' and n[0][1][0] == 'return_if_stmts'):
self.default(node)
return
@@ -780,12 +798,12 @@ class SourceWalker(GenericASTTraversal, object):
self.preorder(node[1])
self.indentLess()
for n in node[2][0]:
for n in return_stmts_node[0]:
n[0].type = 'elifstmt'
self.preorder(n)
self.println(self.indent, 'else:')
self.indentMore()
self.preorder(node[2][1])
self.preorder(return_stmts_node[1])
self.indentLess()
self.prune()
@@ -1426,6 +1444,17 @@ class SourceWalker(GenericASTTraversal, object):
i += 3
pass
pass
elif node[-1].type.startswith('BUILD_CONST_KEY_MAP'):
# Python 3.6+ style const map
keys = node[-2].pattr
values = node[:-2]
# FIXME: Line numbers?
for key, value in zip(keys, values):
self.write(repr(key))
self.write(':')
self.write(self.traverse(value[0]))
self.write(',')
pass
pass
else:
# Python 2 style kvlist


@@ -201,11 +201,10 @@ def cmp_code_objects(version, is_pypy, code_obj1, code_obj2,
else:
import uncompyle6.scanners.scanner27 as scan
scanner = scan.Scanner27()
elif version == 3.0:
import uncompyle6.scanners.scanner30 as scan
scanner = scan.Scanner30()
elif version == 3.1:
if is_pypy:
import uncompyle6.scanners.pypy31 as scan
scanner = scan.ScannerPyPy31()
else:
import uncompyle6.scanners.scanner32 as scan
scanner = scan.Scanner32()
elif version == 3.2:
@@ -319,8 +318,14 @@ def cmp_code_objects(version, is_pypy, code_obj1, code_obj2,
elif tokens1[i1].type == 'LOAD_NAME' and tokens2[i2].type == 'LOAD_CONST' \
and tokens1[i1].pattr == 'None' and tokens2[i2].pattr is None:
pass
elif tokens1[i1].type == 'RETURN_VALUE' and \
tokens2[i2].type == 'RETURN_END_IF':
elif tokens1[i1].type == 'LOAD_GLOBAL' and tokens2[i2].type == 'LOAD_NAME' \
and tokens1[i1].pattr == tokens2[i2].pattr:
pass
elif (tokens1[i1].type == 'RETURN_VALUE' and
tokens2[i2].type == 'RETURN_END_IF'):
pass
elif (tokens1[i1].type == 'BUILD_TUPLE_0' and
tokens2[i2].pattr == ()):
pass
else:
raise CmpErrorCode(name, tokens1[i1].offset, tokens1[i1],


@@ -1,3 +1,3 @@
# This file is suitable for sourcing inside bash as
# well as importing into Python
VERSION='2.9.9'
VERSION='2.9.10'