Sync with master

This commit is contained in:
rocky
2017-12-02 09:51:15 -05:00
parent 0d9464bb92
commit 0b9fca2263
89 changed files with 619 additions and 434 deletions

View File

@@ -114,10 +114,18 @@ is done by transforming the byte code into a pseudo-2.7 Python
bytecode and is based on code from Eloi Vanderbeken.
This project, `uncompyle6`, abandons that approach for various
reasons. However the main reason is that we need offsets in fragment
deparsing to be exactly the same, and the transformation process can
remove instructions. _Adding_ instructions with psuedo offsets is
however okay.
reasons. Having a grammar per Python version is much cleaner and it
scales indefinitely. That said, we don't have entire copies of the
grammar, but work off of differences from some neighboring version.
And this too I find helpful. Should there be a desire to rebase or
start a new base version to work off of, say for some future Python
version, that can be done by dumping a grammar for a specific version
after it has been loaded incrementally.
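The layering described here can be sketched in miniature. This is a hypothetical illustration, not uncompyle6's actual parser classes (which are SPARK parser subclasses): each version's grammar is expressed as a delta applied to a neighboring version's grammar, and the accumulated grammar can be dumped whole when rebasing.

```python
# Hypothetical miniature of "a grammar per version, stored as deltas".
# Names and rules here are made up for illustration; uncompyle6's real
# parsers are SPARK classes that inherit from a neighboring version.

class Grammar:
    def __init__(self):
        self.rules = set()           # set of (lhs, rhs-tuple) productions

    def add(self, lhs, rhs):
        self.rules.add((lhs, tuple(rhs.split())))

    def remove(self, lhs, rhs):
        self.rules.discard((lhs, tuple(rhs.split())))

    def dump(self):
        # "Rebasing": dump the fully accumulated grammar for one version
        # after it has been built up incrementally.
        return sorted("%s ::= %s" % (lhs, " ".join(rhs))
                      for lhs, rhs in self.rules)

def grammar27():
    g = Grammar()
    g.add("stmt", "print_stmt")
    g.add("stmt", "assign")
    return g

def grammar30():
    # Start from the neighboring 2.7 grammar, apply only the differences.
    g = grammar27()
    g.remove("stmt", "print_stmt")   # print became a function in 3.0
    g.add("stmt", "call_stmt")
    return g

print(grammar30().dump())   # → ['stmt ::= assign', 'stmt ::= call_stmt']
```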
Another problem with pseudo-2.7 bytecode is that we need offsets
in fragment deparsing to be exactly the same as in the bytecode; the
transformation process can remove instructions. _Adding_ instructions
with pseudo offsets is, however, okay.
`Uncompyle6` however owes its existence to the fork of `uncompyle2` by
Myst herie (Mysterie) whose first commit picks up at
@@ -159,21 +167,37 @@ if the grammar is LR or left recursive.
Another approach that doesn't use grammars is to do something like
simulate execution symbolically and build expression trees off of
stack results. Control flow in that apprproach still needs to be
handled somewhat ad hoc. The two important projects that work this
way are [unpyc3](https://code.google.com/p/unpyc3/) and most
especially [pycdc](https://github.com/zrax/pycdc) The latter project
is largely by Michael Hansen and Darryl Pogue. If they supported
getting source-code fragments, did a better job in supporting Python
more fully, and had a way I could call it from Python, I'd probably
would have ditched this and used that. The code runs blindingly fast
and spans all versions of Python, although more recently Python 3
support has been lagging.
stack results. Control flow in that approach still needs to be handled
somewhat ad hoc. The two important projects that work this way are
[unpyc3](https://code.google.com/p/unpyc3/) and most especially
[pycdc](https://github.com/zrax/pycdc). The latter project is largely
by Michael Hansen and Darryl Pogue. If they supported getting
source-code fragments, did a better job in supporting Python more
fully, and had a way I could call it from Python, I'd probably have
ditched this and used that. The code runs blindingly fast and spans
all versions of Python, although more recently Python 3 support has
been lagging. The code is impressive for its smallness given that it
covers many versions of Python. However, I think it has reached a
scalability issue, the same as all the other efforts. For it to handle
Python versions more accurately, I think it will need a lot more code
that specializes for particular Python versions.
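The stack-simulation approach can be illustrated with a toy sketch. The instruction set below is made up for illustration and is not real CPython bytecode handling; it only shows the core idea of pushing symbolic expressions instead of values.

```python
# Hypothetical sketch of the "simulate execution symbolically" style
# used by unpyc3/pycdc: instead of computing values, push expression
# text onto a symbolic stack and emit statements on stores.

def symbolic_decompile(instructions):
    """Build source text by pushing/popping a symbolic stack."""
    stack = []
    lines = []
    for op, arg in instructions:
        if op == "LOAD_CONST":
            stack.append(repr(arg))
        elif op == "LOAD_NAME":
            stack.append(arg)
        elif op == "BINARY_ADD":
            rhs, lhs = stack.pop(), stack.pop()
            stack.append("(%s + %s)" % (lhs, rhs))
        elif op == "STORE_NAME":
            lines.append("%s = %s" % (arg, stack.pop()))
    return lines

code = [("LOAD_NAME", "x"), ("LOAD_CONST", 1),
        ("BINARY_ADD", None), ("STORE_NAME", "y")]
print(symbolic_decompile(code))   # → ['y = (x + 1)']
```

Control flow is exactly what this sketch leaves out: jumps don't map onto a simple stack discipline, which is why those projects handle it ad hoc.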
Tests for the project have been, or are being, culled from all of the
projects mentioned.
projects mentioned. Quite a few have been added to improve grammar
coverage and to address the numerous bugs that have been encountered.
For a little bit of the history of changes to the Early-algorithm parser,
If you think, as I am sure will happen in the future, "hey, I can just
write a decompiler from scratch and not have to deal with all of the
complexity here", think again. What is likely to happen is that you'll
get at best a 90% solution working for a single Python release that
will be obsolete in about a year, and more obsolete each subsequent
year. Writing a decompiler for Python gets harder as Python
progresses, so writing one for Python 3.7 isn't as easy as it was for
Python 2.2. That said, if you still feel you want to write a
single-version decompiler, talk to me. I may have some ideas.
For a little bit of the history of changes to the Earley-algorithm parser,
see the file [NEW-FEATURES.rst](https://github.com/rocky/python-spark/blob/master/NEW-FEATURES.rst) in the [python-spark github repository](https://github.com/rocky/python-spark).
NB. If you find mistakes, want corrections, or want your name added

View File

@@ -3,7 +3,7 @@
uncompyle6
==========
A native Python cross-version Decompiler and Fragment Decompiler.
A native Python cross-version decompiler and fragment decompiler.
The successor to decompyle, uncompyle, and uncompyle2.
@@ -17,16 +17,17 @@ source code. It accepts bytecodes from Python version 1.5, and 2.1 to
Why this?
---------
Ok, I'll say it: this software is amazing. It is a little more than
just your normal hacky decompiler. Using compiler_ technology, the
programs creates a parse tree of the program from the instructions;
nodes at the upper levels that look like they come from a Python
Ok, I'll say it: this software is amazing. It is more than your
normal hacky decompiler. Using compiler_ technology, the program
creates a parse tree of the program from the instructions; nodes at
the upper levels look a little like what might come from a Python
AST. So we can really classify and understand what's going on in
sections of instructions.
sections of Python bytecode.
So another thing that makes this different from other CPython bytecode
decompilers is the ability to deparse just *fragments* of source code
and give source-code information around a given bytecode offset.
Building on this, another thing that makes this different from other
CPython bytecode decompilers is the ability to deparse just
*fragments* of source code and give source-code information around a
given bytecode offset.
I use the tree fragments to deparse fragments of code inside my
trepan_ debuggers_. For that, bytecode offsets are recorded and
@@ -34,17 +35,22 @@ associated with fragments of the source code. This purpose, although
compatible with the original intention, is yet a little bit different.
See this_ for more information.
The idea of Python fragment deparsing given an instruction offset can
be used in showing stack traces or any program that wants to show a
location in more detail than just a line number. It can be also used
when source-code information does not exist and there is just bytecode
Python fragment deparsing given an instruction offset is useful in
showing stack traces and can be incorporated into any program that
wants to show a location in more detail than just a line number at
runtime. This code can also be used when source-code information does
not exist and there is just bytecode. Again, my debuggers make use of
this.
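The offset-to-fragment idea can be sketched as follows. The class and method names here are hypothetical, not the actual uncompyle6 fragments API; the point is only the data structure: record each deparsed fragment with the bytecode offset where it starts, then look up the fragment covering a given offset.

```python
# Hypothetical sketch of associating bytecode offsets with deparsed
# source fragments so a debugger can show the code at an offset.
# (Not the real uncompyle6 fragments API.)
import bisect

class FragmentMap:
    def __init__(self):
        self._starts = []        # sorted bytecode offsets
        self._fragments = []     # source text starting at each offset

    def record(self, offset, source_text):
        i = bisect.bisect_left(self._starts, offset)
        self._starts.insert(i, offset)
        self._fragments.insert(i, source_text)

    def around(self, offset):
        """Return the source fragment covering the given offset."""
        i = bisect.bisect_right(self._starts, offset) - 1
        return self._fragments[i] if i >= 0 else None

m = FragmentMap()
m.record(0, "x = 1")
m.record(6, "y = x + 2")
print(m.around(8))   # → y = x + 2
```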
There were (and still are) a number of decompyle, uncompyle, uncompyle2,
uncompyle3 forks around. Almost all of them come basically from the
same code base, and (almost?) all of them are no longer actively
maintained. Only one handled Python 3, and even there, only 3.2 or 3.3
depending on which code is used. This code pulls these together and
moves forward.
There were (and still are) a number of decompyle, uncompyle,
uncompyle2, uncompyle3 forks around. Almost all of them come basically
from the same code base, and (almost?) all of them are no longer
actively maintained. One was really good at decompiling Python 1.5-2.3
or so, another really good at Python 2.7, but only that. Another
handled Python 3.2 only; another patched that and handled only 3.3.
You get the idea. This code pulls all of these forks together and
*moves forward*. There is some serious refactoring and cleanup in this
code base over those old forks.
This project has the most complete support for Python 3.3 and above
and the best all-around Python support.

View File

@@ -0,0 +1,19 @@
# -*- shell-script -*-
# Sets PYVERSIONS to be all pyenv versions we have
if [[ $0 == ${BASH_SOURCE[0]} ]] ; then
echo "This script should be *sourced* rather than run directly through bash"
exit 1
fi
olddir=$(pwd)
mydir=$(dirname ${BASH_SOURCE[0]})
cd $mydir
all=""
for file in pyenv-{olde{st,r},newer}-versions ; do
. $mydir/$file
all="$all $PYVERSIONS"
done
PYVERSIONS="$all"
cd $olddir

View File

@@ -1,6 +1,8 @@
# -*- shell-script -*-
# Sets PYVERSIONS to be pyenv versions that
# we can use in the master branch.
if [[ $0 == ${BASH_SOURCE[0]} ]] ; then
echo "This script should be *sourced* rather than run directly through bash"
exit 1
fi
export PYVERSIONS='3.5.2 3.6.2 2.6.9 3.3.6 2.7.14 3.4.2'
export PYVERSIONS='3.5.3 3.6.3 2.6.9 3.3.6 2.7.14 3.4.2'

View File

@@ -1,4 +1,7 @@
# -*- shell-script -*-
# Sets PYVERSIONS to be pyenv versions that
# we can use in the python-2.4 branch.
if [[ $0 == ${BASH_SOURCE[0]} ]] ; then
echo "This script should be *sourced* rather than run directly through bash"
exit 1

View File

@@ -0,0 +1,9 @@
# -*- shell-script -*-
# Sets PYVERSIONS to be the oldest pyenv versions we have.
# These are not covered (yet) by uncompyle6, although
# some programs do work here.
if [[ $0 == ${BASH_SOURCE[0]} ]] ; then
echo "This script should be *sourced* rather than run directly through bash"
exit 1
fi
export PYVERSIONS='2.1.3 2.2.3 2.3.7'

View File

@@ -17,25 +17,38 @@ def test_grammar():
(lhs, rhs, tokens,
right_recursive, dup_rhs) = p.check_sets()
expect_lhs = set(['expr1024', 'pos_arg'])
unused_rhs = set(['build_list', 'call', 'mkfunc',
unused_rhs = set(['list', 'mkfunc',
'mklambda',
'unpack',])
expect_right_recursive = frozenset([('designList',
('store', 'DUP_TOP', 'designList'))])
if PYTHON3:
expect_lhs.add('load_genexpr')
expect_lhs.add('kvlist')
expect_lhs.add('kv3')
unused_rhs = unused_rhs.union(set("""
except_pop_except generator_exp classdefdeco2 listcomp
except_pop_except generator_exp classdefdeco2
dict
""".split()))
if 3.0 <= PYTHON_VERSION:
if PYTHON_VERSION >= 3.0:
expect_lhs.add("annotate_arg")
expect_lhs.add("annotate_tuple")
unused_rhs.add("mkfunc_annotate")
if PYTHON_VERSION != 3.6:
# 3.6 has at least one non-custom call rule
# the others don't
unused_rhs.add('call')
else:
# These are custom rules set in 3.5
unused_rhs.add('build_map_unpack_with_call')
unused_rhs.add('unmapexpr')
pass
pass
pass
else:
expect_lhs.add('kwarg')
unused_rhs.add('call')
assert expect_lhs == set(lhs)
assert unused_rhs == set(rhs)
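The kind of set computation that `check_sets` performs above can be sketched like this. This is a hypothetical implementation, not the real spark-parser method: it finds nonterminals that are defined but never referenced on any right-hand side, which is what the `unused_rhs` expectations are tracking.

```python
# Sketch of a grammar sanity check: find nonterminals defined but never
# referenced, and symbols referenced but never defined. Terminals are
# assumed to be UPPERCASE token names, as in the grammars above.
def check_unused(rules):
    defined = {lhs for lhs, _ in rules}
    referenced = {sym for _, rhs in rules for sym in rhs}
    never_referenced = defined - referenced          # candidate dead rules
    never_defined = {s for s in referenced
                     if s not in defined and not s.isupper()}
    return never_referenced, never_defined

rules = [
    ("stmts", ("stmts", "stmt")),
    ("stmts", ("stmt",)),
    ("stmt", ("assign",)),
    ("assign", ("expr", "store")),
    ("expr", ("LOAD_CONST",)),
    ("store", ("STORE_NAME",)),
    ("mklambda", ("expr", "MAKE_FUNCTION_0")),   # defined but unused
]
unused, undefined = check_unused(rules)
print(unused)   # → {'mklambda'}
```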

View File

@@ -1,4 +1,12 @@
PHONY=check clean dist distclean test test-unit test-functional rmChangeLog clean_pyc nosetests
PHONY=check clean dist distclean test test-unit test-functional rmChangeLog clean_pyc nosetests \
check-bytecode-1.5 check-bytecode-1 check-bytecode-2 check-bytecode-3 \
check-bytecode-2.2 check-bytecode-2.3 check-bytecode-2.4 \
check-short check-2.6 check-2.7 check-3.0 check-3.1 check-3.2 check-3.3 \
check-3.4 check-3.5 check-5.6 5.6 5.8 \
grammar-coverage-2.5 grammar-coverage-2.6 grammar-coverage-2.7 \
grammar-coverage-3.1 grammar-coverage-3.2 grammar-coverage-3.3 \
grammar-coverage-3.4 grammar-coverage-3.5 grammar-coverage-3.6
GIT2CL ?= git2cl
PYTHON ?= python
@@ -59,8 +67,7 @@ check-disasm:
$(PYTHON) dis-compare.py
#: Check deparsing bytecode 1.x only
check-bytecode-1:
$(PYTHON) test_pythonlib.py --bytecode-1.5
check-bytecode-1: check-bytecode-1.5
#: Check deparsing bytecode 2.x only
check-bytecode-2:
@@ -82,6 +89,10 @@ check-bytecode: check-bytecode-3
--bytecode-pypy2.7 --bytecode-1
#: Check deparsing bytecode 1.5 only
check-bytecode-1.5:
$(PYTHON) test_pythonlib.py --bytecode-1.5
#: Check deparsing Python 2.1
check-bytecode-2.1:
$(PYTHON) test_pythonlib.py --bytecode-2.1
@@ -130,6 +141,19 @@ grammar-coverage-2.7:
SPARK_PARSER_COVERAGE=$(COVER_DIR)/spark-grammar-27.cover $(PYTHON) test_pythonlib.py --bytecode-2.7
SPARK_PARSER_COVERAGE=$(COVER_DIR)/spark-grammar-27.cover $(PYTHON) test_pyenvlib.py --2.7.13
#: Get grammar coverage for Python 3.0
grammar-coverage-3.0:
-rm $(COVER_DIR)/spark-grammar-30.cover
SPARK_PARSER_COVERAGE=$(COVER_DIR)/spark-grammar-30.cover $(PYTHON) test_pythonlib.py --bytecode-3.0
SPARK_PARSER_COVERAGE=$(COVER_DIR)/spark-grammar-30.cover $(PYTHON) test_pyenvlib.py --3.0.1
#: Get grammar coverage for Python 3.1
grammar-coverage-3.1:
-rm $(COVER_DIR)/spark-grammar-31.cover
SPARK_PARSER_COVERAGE=$(COVER_DIR)/spark-grammar-31.cover $(PYTHON) test_pythonlib.py --bytecode-3.1
SPARK_PARSER_COVERAGE=$(COVER_DIR)/spark-grammar-31.cover $(PYTHON) test_pyenvlib.py --3.1.5
#: Get grammar coverage for Python 3.2
grammar-coverage-3.2:
-rm $(COVER_DIR)/spark-grammar-32.cover
@@ -152,7 +176,7 @@ grammar-coverage-3.4:
grammar-coverage-3.5:
rm $(COVER_DIR)/spark-grammar-35.cover || /bin/true
SPARK_PARSER_COVERAGE=$(COVER_DIR)/spark-grammar-35.cover $(PYTHON) test_pythonlib.py --bytecode-3.5
SPARK_PARSER_COVERAGE=$(COVER_DIR)/spark-grammar-35.cover $(PYTHON) test_pyenvlib.py --3.5.2
SPARK_PARSER_COVERAGE=$(COVER_DIR)/spark-grammar-35.cover $(PYTHON) test_pyenvlib.py --3.5.3
#: Check deparsing Python 2.6
check-bytecode-2.6:

Binary file not shown.


View File

@@ -12,7 +12,7 @@ y ^= 1 # INPLACE_XOR
`y` # UNARY_CONVERT - Not in Python 3.x
# Beef up augassign and STORE_SLICE+3
# Beef up aug_assign and STORE_SLICE+3
x = [1,2,3,4,5]
x[0:1] = 1
x[0:3] += 1, 2, 3

View File

@@ -1,7 +1,7 @@
# From 2.6.9 abc.py
# For 2.6:
# genexpr_func ::= setup_loop_lf FOR_ITER designator comp_iter JUMP_BACK come_from_pop JUMP_BACK POP_BLOCK COME_FROM
# genexpr_func ::= setup_loop_lf FOR_ITER store comp_iter JUMP_BACK come_from_pop JUMP_BACK POP_BLOCK COME_FROM
# This has been a bug in other Pythons after 2.6, where set-comprehension {} syntax is used instead of set().
abstracts = set(name

View File

@@ -1,5 +1,8 @@
# Statements to beef up grammar coverage rules
# Force "inplace" ops
# Note this is like simple_source/bug22/01_ops.py
# But we don't have the UNARY_CONVERT which dropped
# out around 2.7
y = +10 # UNARY_POSITIVE
y /= 1 # INPLACE_DIVIDE
y %= 4 # INPLACE_MODULO
@@ -10,7 +13,7 @@ y //= 1 # INPLACE_TRUE_DIVIDE
y &= 1 # INPLACE_AND
y ^= 1 # INPLACE_XOR
# Beef up augassign and STORE_SLICE+3
# Beef up aug_assign and STORE_SLICE+3
x = [1,2,3,4,5]
x[0:1] = 1
x[0:3] += 1, 2, 3

View File

@@ -0,0 +1,22 @@
# From python 3.4 asyncio/base_events.py
# Needs a forelselast grammar rule
def create_connection(self, infos, f2, laddr_infos, protocol):
for family in infos:
try:
if f2:
for laddr in laddr_infos:
try:
break
except OSError:
protocol = 'foo'
else:
continue
except OSError:
protocol = 'bar'
else:
break
else:
raise
return protocol

View File

@@ -0,0 +1,9 @@
# Python 3.6 changes, yet again, the way default pairs are handled
def foo1(bar, baz=1):
return 1
def foo2(bar, baz, qux=1):
return 2
def foo3(bar, baz=1, qux=2):
return 3
def foo4(bar, baz, qux=1, quux=2):
return 4

View File

@@ -3,7 +3,7 @@
# Python2 grammar includes:
# list_compr ::= BUILD_LIST_0 list_iter
# list_iter ::= list_for
# list_for ::= expr _for designator list_iter JUMP_BACK
# list_for ::= expr _for store list_iter JUMP_BACK
# list_iter ::= lc_body
# lc_body ::= expr LIST_APPEND
#

View File

@@ -5,7 +5,7 @@
# and ::= expr jmp_false expr \e__come_from
# expr ::= list_compr
# list_iter ::= list_for
# list_for ::= expr _for designator list_iter JUMP_BACK
# list_for ::= expr _for store list_iter JUMP_BACK
# list_iter ::= lc_body
# lc_body ::= expr LIST_APPEND
@@ -19,9 +19,9 @@
# Python2:
# list_compr ::= BUILD_LIST_0 list_iter
# list_iter ::= list_for
# list_for ::= expr _for designator list_iter JUMP_BACK
# list_for ::= expr _for store list_iter JUMP_BACK
# list_iter ::= list_for
# list_for ::= expr _for designator list_iter JUMP_BACK
# list_for ::= expr _for store list_iter JUMP_BACK
# list_iter ::= lc_body
# lc_body ::= expr LIST_APPEND
# [ i * j for i in range(4) for j in range(7) ]

View File

@@ -6,7 +6,7 @@
# 76 JUMP_ABSOLUTE 17 (to 17)
# And getting:
# list_for ::= expr _for designator list_iter JUMP_BACK
# list_for ::= expr _for store list_iter JUMP_BACK
# list_iter ::= list_if JUMP_BACK
# ^^^^^^^^^ added to 2.6 grammar
# list_iter ::= list_for

View File

@@ -1,11 +1,11 @@
# Tests:
#
# For Python3:
# classdef ::= LOAD_BUILD_CLASS mkfunc LOAD_CONST CALL_FUNCTION_2 designator
# classdef ::= LOAD_BUILD_CLASS mkfunc LOAD_CONST CALL_FUNCTION_2 store
# mkfunc ::= LOAD_CONST LOAD_CONST MAKE_FUNCTION_0
# For Python2:
# classdef ::= LOAD_CONST expr mkfunc CALL_FUNCTION_0 BUILD_CLASS designator
# classdef ::= LOAD_CONST expr mkfunc CALL_FUNCTION_0 BUILD_CLASS store
# mkfunc ::= LOAD_CONST MAKE_FUNCTION_0
class A:

View File

@@ -1,14 +1,14 @@
# Tests
# Python3:
# funcdef ::= mkfunc designator
# designator ::= STORE_DEREF
# function_def ::= mkfunc store
# store ::= STORE_DEREF
# mkfunc ::= load_closure BUILD_TUPLE_1 LOAD_CONST LOAD_CONST MAKE_CLOSURE_0
# load_closure ::= LOAD_CLOSURE
#
# Python2:
# funcdef ::= mkfunc designator
# designator ::= STORE_DEREF
# function_def ::= mkfunc store
# store ::= STORE_DEREF
# mkfunc ::= load_closure LOAD_CONST MAKE_CLOSURE_0
# load_closure ::= LOAD_CLOSURE

View File

@@ -1,10 +1,10 @@
# Tests:
# importstmt ::= LOAD_CONST LOAD_CONST import_as
# import_as ::= IMPORT_NAME designator
# import_as ::= IMPORT_NAME store
# Since Python 3.3:
# classdef ::= buildclass designator
# designator ::= STORE_NAME
# classdef ::= buildclass store
# store ::= STORE_NAME
# buildclass ::= LOAD_BUILD_CLASS mkfunc LOAD_CONST expr CALL_FUNCTION_3
# mkfunc ::= LOAD_CONST LOAD_CONST MAKE_FUNCTION_0

View File

@@ -13,8 +13,8 @@
# mkfuncdeco0 ::= mkfunc
# mkfuncdeco ::= expr mkfuncdeco0 CALL_FUNCTION_1
# designator ::= STORE_FAST
# funcdefdeco ::= mkfuncdeco designator
# store ::= STORE_FAST
# funcdefdeco ::= mkfuncdeco store
# stmt ::= funcdefdeco

View File

@@ -13,8 +13,8 @@
# mkfuncdeco0 ::= mkfunc
# mkfuncdeco ::= expr mkfuncdeco0 CALL_FUNCTION_1
# designator ::= STORE_FAST
# funcdefdeco ::= mkfuncdeco designator
# store ::= STORE_FAST
# funcdefdeco ::= mkfuncdeco store
# stmt ::= funcdefdeco
from functools import wraps

View File

@@ -5,8 +5,8 @@
# mkfuncdeco0 ::= mkfunc
# classdefdeco2 ::= LOAD_CONST expr mkfunc CALL_FUNCTION_0 BUILD_CLASS
# classdefdeco1 ::= expr classdefdeco1 CALL_FUNCTION_1
# designator ::= STORE_NAME
# classdefdeco ::= classdefdeco1 designator
# store ::= STORE_NAME
# classdefdeco ::= classdefdeco1 store
def author(*author_names):
def author_func(cls):

View File

@@ -1,5 +1,5 @@
# Tests:
# forstmt ::= SETUP_LOOP expr _for designator for_block POP_BLOCK COME_FROM
# forstmt ::= SETUP_LOOP expr _for store for_block POP_BLOCK COME_FROM
# for_block ::= l_stmts_opt JUMP_BACK
# trystmt ::= SETUP_EXCEPT suite_stmts_opt POP_BLOCK try_middle COME_FROM
# try_middle ::= jmp_abs COME_FROM except_stmts END_FINALLY

View File

@@ -4,16 +4,16 @@
# get_iter ::= expr GET_ITER
# expr ::= get_iter
# _for ::= GET_ITER FOR_ITER
# designator ::= STORE_FAST
# store ::= STORE_FAST
# expr ::= LOAD_FAST
# yield ::= expr YIELD_VALUE
# expr ::= yield
# gen_comp_body ::= expr YIELD_VALUE POP_TOP
# comp_body ::= gen_comp_body
# comp_iter ::= comp_body
# comp_for ::= expr _for designator comp_iter JUMP_BACK
# comp_for ::= expr _for store comp_iter JUMP_BACK
# comp_iter ::= comp_for
# genexpr_func ::= LOAD_FAST FOR_ITER designator comp_iter JUMP_BACK
# genexpr_func ::= LOAD_FAST FOR_ITER store comp_iter JUMP_BACK
def multi_genexpr(blog_posts):

View File

@@ -1,5 +1,5 @@
# Tests:
# forstmt ::= SETUP_LOOP expr _for designator
# forstmt ::= SETUP_LOOP expr _for store
# for_block POP_BLOCK COME_FROM
for a in [1]:
c = 2

View File

@@ -1,5 +1,5 @@
# Tests:
# assign ::= expr designator
# assign ::= expr store
a = 'None'
b = None

View File

@@ -1,6 +1,9 @@
# Tests:
# Tests all the different kinds of imports
import sys
from os import path
from os import *
import time as time1, os as os1
import http.client as httpclient
if len(__file__) == 0:
# a.b.c should force consecutive LOAD_ATTRs
import a.b.c as d

View File

@@ -1,3 +1,3 @@
# Tests:
# assign ::= expr designator
# assign ::= expr store
pass

View File

@@ -26,11 +26,11 @@ l[1][2][3] = 7
l[1][2][3] *= 3;
# Python 2.x
# augassign1 ::= expr expr inplace_op ROT_TWO STORE_SLICE+0
# aug_assign1 ::= expr expr inplace_op ROT_TWO STORE_SLICE+0
l[:] += [9]; # print l
# Python 2.x
# augassign1 ::= expr expr inplace_op ROT_THREE STORE_SLICE+2
# aug_assign1 ::= expr expr inplace_op ROT_THREE STORE_SLICE+2
l[:2] += [9]; # print l

View File

@@ -0,0 +1,13 @@
# exec.py -- source test pattern for exec statement
#
# This simple program is part of the decompyle test suite.
#
# decompyle is a Python byte-code decompiler
# See http://www.goebel-consult.de/decompyle/ for download and
# for further information
testcode = 'a = 12'
exec testcode
exec testcode in globals()
exec testcode in globals(), locals()

View File

@@ -2,7 +2,7 @@
# Bug in 2.6.9 was handling with as. Added rules
#
# withasstmt ::= expr setupwithas designator suite_stmts_opt
# withasstmt ::= expr setupwithas store suite_stmts_opt
# POP_BLOCK LOAD_CONST COME_FROM WITH_CLEANUP END_FINALLY
# setupwithas ::= DUP_TOP LOAD_ATTR ROT_TWO LOAD_ATTR CALL_FUNCTION_0 STORE_FAST
# SETUP_FINALLY LOAD_FAST DELETE_FAST

View File

@@ -1,6 +1,9 @@
# Ensures opcodes DELETE_SUBSCR and DELETE_GLOBAL are covered
a = (1, 2, 3)
# DELETE_NAME
del a
# DELETE_SUBSCR
b = [4, 5, 6]
del b[1]
del b[:]
@@ -14,5 +17,14 @@ del d[1:3:2]
e = ('a', 'b')
def foo():
# covers DELETE_GLOBAL
global e
del e
def a():
del z
def b(y):
# covers DELETE_FAST
del y
# LOAD_DEREF
return z

View File

@@ -31,7 +31,7 @@ TEST_VERSIONS=('2.3.7', '2.4.6', '2.5.6', '2.6.9',
'2.7.10', '2.7.11', '2.7.12', '2.7.13', '2.7.14',
'3.0.1', '3.1.5', '3.2.6',
'3.3.5', '3.3.6',
'3.4.2', '3.5.1', '3.5.2', '3.6.0', '3.6.3',
'3.4.2', '3.5.3', '3.6.0', '3.6.3',
'native')
target_base = '/tmp/py-dis/'

View File

@@ -167,8 +167,8 @@ class PythonParser(GenericASTBuilder):
return GenericASTBuilder.ambiguity(self, children)
def resolve(self, list):
if len(list) == 2 and 'funcdef' in list and 'assign' in list:
return 'funcdef'
if len(list) == 2 and 'function_def' in list and 'assign' in list:
return 'function_def'
if 'grammar' in list and 'expr' in list:
return 'expr'
# print >> sys.stderr, 'resolve', str(list)
@@ -180,8 +180,7 @@ class PythonParser(GenericASTBuilder):
def p_start(self, args):
'''
# The start or goal symbol
stmts ::= stmts sstmt
stmts ::= sstmt
stmts ::= sstmt+
'''
def p_call_stmt(self, args):
@@ -267,7 +266,6 @@ class PythonParser(GenericASTBuilder):
stmt ::= return_stmt
return_stmt ::= ret_expr RETURN_VALUE
return_stmt_lambda ::= ret_expr RETURN_VALUE_LAMBDA
# return_stmts are a sequence of statements that ends in a RETURN statement.
# In later Python versions with jump optimization, this can cause JUMPs
@@ -279,10 +277,10 @@ class PythonParser(GenericASTBuilder):
"""
pass
def p_funcdef(self, args):
def p_function_def(self, args):
'''
stmt ::= funcdef
funcdef ::= mkfunc store
stmt ::= function_def
function_def ::= mkfunc store
stmt ::= funcdefdeco
funcdefdeco ::= mkfuncdeco store
mkfuncdeco ::= expr mkfuncdeco CALL_FUNCTION_1
@@ -317,16 +315,17 @@ class PythonParser(GenericASTBuilder):
def p_augmented_assign(self, args):
'''
stmt ::= augassign1
stmt ::= augassign2
stmt ::= aug_assign1
stmt ::= aug_assign2
# This is odd in that other augassign1's have only 3 slots
# This is odd in that other aug_assign1's have only 3 slots
# The store isn't used as that's supposed to be also
# indicated in the first expr
augassign1 ::= expr expr inplace_op store
augassign1 ::= expr expr inplace_op ROT_THREE STORE_SUBSCR
augassign2 ::= expr DUP_TOP LOAD_ATTR expr
aug_assign1 ::= expr expr
inplace_op store
aug_assign1 ::= expr expr
inplace_op ROT_THREE STORE_SUBSCR
aug_assign2 ::= expr DUP_TOP LOAD_ATTR expr
inplace_op ROT_TWO STORE_ATTR
inplace_op ::= INPLACE_ADD
@@ -376,28 +375,31 @@ class PythonParser(GenericASTBuilder):
def p_import20(self, args):
"""
stmt ::= importstmt
stmt ::= importfrom
stmt ::= importstar
stmt ::= import
stmt ::= import_from
stmt ::= import_from_star
stmt ::= importmultiple
importlist ::= importlist import_as
importlist ::= import_as
import_as ::= IMPORT_NAME store
import_as ::= IMPORT_FROM store
importlist ::= importlist alias
importlist ::= alias
alias ::= IMPORT_NAME store
alias ::= IMPORT_FROM store
alias ::= IMPORT_NAME load_attrs store
importstmt ::= LOAD_CONST LOAD_CONST import_as
importstar ::= LOAD_CONST LOAD_CONST IMPORT_NAME IMPORT_STAR
importfrom ::= LOAD_CONST LOAD_CONST IMPORT_NAME importlist POP_TOP
importmultiple ::= LOAD_CONST LOAD_CONST import_as imports_cont
import ::= LOAD_CONST LOAD_CONST alias
import_from_star ::= LOAD_CONST LOAD_CONST IMPORT_NAME IMPORT_STAR
import_from ::= LOAD_CONST LOAD_CONST IMPORT_NAME importlist POP_TOP
importmultiple ::= LOAD_CONST LOAD_CONST alias imports_cont
imports_cont ::= import_cont+
import_cont ::= LOAD_CONST LOAD_CONST import_as
import_cont ::= LOAD_CONST LOAD_CONST alias
load_attrs ::= LOAD_ATTR+
"""
def p_list_comprehension(self, args):
"""
expr ::= list_compr
expr ::= list_comp
list_iter ::= list_for
list_iter ::= list_if
@@ -408,7 +410,7 @@ class PythonParser(GenericASTBuilder):
list_if_not ::= expr jmp_true list_iter
"""
def p_setcomp(self, args):
def p_set_comp(self, args):
"""
comp_iter ::= comp_for
comp_iter ::= comp_body
@@ -429,9 +431,9 @@ class PythonParser(GenericASTBuilder):
expr ::= LOAD_DEREF
expr ::= load_attr
expr ::= binary_expr
expr ::= build_list
expr ::= list
expr ::= compare
expr ::= mapexpr
expr ::= dict
expr ::= and
expr ::= or
expr ::= unary_expr
@@ -482,14 +484,10 @@ class PythonParser(GenericASTBuilder):
ret_expr_or_cond ::= ret_cond
stmt ::= return_lambda
stmt ::= conditional_lambda
return_lambda ::= ret_expr RETURN_VALUE_LAMBDA LAMBDA_MARKER
return_lambda ::= ret_expr RETURN_VALUE_LAMBDA
# Doesn't seem to be used anymore, but other conditional_lambda's are
# conditional_lambda ::= expr jmp_false return_if_stmt return_stmt LAMBDA_MARKER
compare ::= compare_chained
compare ::= compare_single
compare_single ::= expr expr COMPARE_OP
@@ -501,9 +499,6 @@ class PythonParser(GenericASTBuilder):
# Non-null kvlist items are broken out in the individual grammars
kvlist ::=
exprlist ::= exprlist expr
exprlist ::= expr
# Positional arguments in make_function
pos_arg ::= expr

View File

@@ -13,11 +13,11 @@ class Python15Parser(Python21Parser):
def p_import15(self, args):
"""
importstmt ::= filler IMPORT_NAME STORE_FAST
importstmt ::= filler IMPORT_NAME STORE_NAME
import ::= filler IMPORT_NAME STORE_FAST
import ::= filler IMPORT_NAME STORE_NAME
importfrom ::= filler IMPORT_NAME importlist
importfrom ::= filler filler IMPORT_NAME importlist POP_TOP
import_from ::= filler IMPORT_NAME importlist
import_from ::= filler filler IMPORT_NAME importlist POP_TOP
importlist ::= importlist IMPORT_FROM
importlist ::= IMPORT_FROM

View File

@@ -36,13 +36,6 @@ class Python2Parser(PythonParser):
print_nl ::= PRINT_NEWLINE
"""
def p_stmt2(self, args):
"""
exec_stmt ::= expr exprlist DUP_TOP EXEC_STMT
exec_stmt ::= expr exprlist EXEC_STMT
"""
def p_print_to(self, args):
'''
stmt ::= print_to
@@ -65,6 +58,8 @@ class Python2Parser(PythonParser):
return_if_stmts ::= _stmts return_if_stmt
return_if_stmt ::= ret_expr RETURN_END_IF
return_stmt_lambda ::= ret_expr RETURN_VALUE_LAMBDA
stmt ::= break_stmt
break_stmt ::= BREAK_LOOP
@@ -100,7 +95,7 @@ class Python2Parser(PythonParser):
kvlist ::= kvlist kv3
kv3 ::= expr expr STORE_MAP
mapexpr ::= BUILD_MAP kvlist
dict ::= BUILD_MAP kvlist
classdef ::= buildclass store
@@ -198,10 +193,10 @@ class Python2Parser(PythonParser):
store ::= expr expr STORE_SLICE+2
store ::= expr expr expr STORE_SLICE+3
augassign1 ::= expr expr inplace_op ROT_FOUR STORE_SLICE+3
augassign1 ::= expr expr inplace_op ROT_THREE STORE_SLICE+1
augassign1 ::= expr expr inplace_op ROT_THREE STORE_SLICE+2
augassign1 ::= expr expr inplace_op ROT_TWO STORE_SLICE+0
aug_assign1 ::= expr expr inplace_op ROT_FOUR STORE_SLICE+3
aug_assign1 ::= expr expr inplace_op ROT_THREE STORE_SLICE+1
aug_assign1 ::= expr expr inplace_op ROT_THREE STORE_SLICE+2
aug_assign1 ::= expr expr inplace_op ROT_TWO STORE_SLICE+0
slice0 ::= expr SLICE+0
slice0 ::= expr DUP_TOP SLICE+0
@@ -224,8 +219,8 @@ class Python2Parser(PythonParser):
Special handling for opcodes such as those that take a variable number
of arguments -- we add a new rule for each:
build_list ::= {expr}^n BUILD_LIST_n
build_list ::= {expr}^n BUILD_TUPLE_n
list ::= {expr}^n BUILD_LIST_n
list ::= {expr}^n BUILD_TUPLE_n
unpack_list ::= UNPACK_LIST {expr}^n
unpack ::= UNPACK_TUPLE {expr}^n
unpack ::= UNPACK_SEQUENCE {expr}^n
@@ -251,7 +246,7 @@ class Python2Parser(PythonParser):
stmt ::= assign2_pypy
assign3_pypy ::= expr expr expr store store store
assign2_pypy ::= expr expr store store
list_compr ::= expr BUILD_LIST_FROM_ARG _for store list_iter
list_comp ::= expr BUILD_LIST_FROM_ARG _for store list_iter
JUMP_BACK
""", nop_func)
for i, token in enumerate(tokens):
@@ -275,7 +270,7 @@ class Python2Parser(PythonParser):
self.add_unique_rule("expr1024 ::=%s" % (' expr32' * 32),
opname_base, v, customize)
self.seen1024 = True
rule = ('build_list ::= ' + 'expr1024 '*thousands +
rule = ('list ::= ' + 'expr1024 '*thousands +
'expr32 '*thirty32s + 'expr '*(v % 32) + opname)
elif opname_base == 'BUILD_MAP':
if opname == 'BUILD_MAP_n':
@@ -283,7 +278,7 @@ class Python2Parser(PythonParser):
self.add_unique_rules([
'kvlist_n ::= kvlist_n kv3',
'kvlist_n ::=',
'mapexpr ::= BUILD_MAP_n kvlist_n',
'dict ::= BUILD_MAP_n kvlist_n',
], customize)
if self.version >= 2.7:
self.add_unique_rule(
@@ -295,7 +290,7 @@ class Python2Parser(PythonParser):
kvlist_n = "kvlist_%s" % v
self.add_unique_rules([
(kvlist_n + " ::=" + ' kv3' * v),
"mapexpr ::= %s %s" % (opname, kvlist_n)
"dict ::= %s %s" % (opname, kvlist_n)
], customize)
continue
elif opname_base == 'BUILD_SLICE':
@@ -337,6 +332,13 @@ class Python2Parser(PythonParser):
# FIXME: remove these conditions if they are not needed.
# no longer need to add a rule
continue
elif opname == 'EXEC_STMT':
self.addRule("""
exprlist ::= expr+
exec_stmt ::= expr exprlist DUP_TOP EXEC_STMT
exec_stmt ::= expr exprlist EXEC_STMT
""", nop_func)
continue
elif opname == 'JUMP_IF_NOT_DEBUG':
self.add_unique_rules([
'jmp_true_false ::= POP_JUMP_IF_TRUE',
@@ -349,10 +351,15 @@ class Python2Parser(PythonParser):
"LOAD_ASSERT expr CALL_FUNCTION_1 RAISE_VARARGS_1 COME_FROM",
], customize)
continue
elif opname == 'LOAD_LISTCOMP':
self.add_unique_rules([
"expr ::= listcomp",
], customize)
continue
elif opname == 'LOAD_SETCOMP':
self.add_unique_rules([
"expr ::= setcomp",
"setcomp ::= LOAD_SETCOMP MAKE_FUNCTION_0 expr GET_ITER CALL_FUNCTION_1"
"expr ::= set_comp",
"set_comp ::= LOAD_SETCOMP MAKE_FUNCTION_0 expr GET_ITER CALL_FUNCTION_1"
], customize)
continue
elif opname == 'LOOKUP_METHOD':
@@ -387,13 +394,13 @@ class Python2Parser(PythonParser):
prev_tok = tokens[i-1]
if prev_tok == 'LOAD_DICTCOMP':
self.add_unique_rules([
('dictcomp ::= %s load_closure LOAD_DICTCOMP %s expr'
('dict_comp ::= %s load_closure LOAD_DICTCOMP %s expr'
' GET_ITER CALL_FUNCTION_1' %
('expr '*v, opname))], customize)
elif prev_tok == 'LOAD_SETCOMP':
self.add_unique_rules([
"expr ::= setcomp",
('setcomp ::= %s load_closure LOAD_SETCOMP %s expr'
"expr ::= set_comp",
('set_comp ::= %s load_closure LOAD_SETCOMP %s expr'
' GET_ITER CALL_FUNCTION_1' %
('expr '*v, opname))
], customize)
@@ -428,8 +435,8 @@ class Python2Parser(PythonParser):
raise Exception('unknown customize token %s' % opname)
self.add_unique_rule(rule, opname_base, v, customize)
pass
self.check_reduce['augassign1'] = 'AST'
self.check_reduce['augassign2'] = 'AST'
self.check_reduce['aug_assign1'] = 'AST'
self.check_reduce['aug_assign2'] = 'AST'
self.check_reduce['_stmts'] = 'AST'
# Dead code testing...
@@ -445,7 +452,7 @@ class Python2Parser(PythonParser):
# if lhs == 'while1elsestmt':
# from trepan.api import debug; debug()
if lhs in ('augassign1', 'augassign2') and ast[0] and ast[0][0] == 'and':
if lhs in ('aug_assign1', 'aug_assign2') and ast[0] and ast[0][0] == 'and':
return True
elif lhs == '_stmts':
for i, stmt in enumerate(ast):

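The customization hunks above repeatedly call `add_unique_rule`/`add_unique_rules` while looping over tokens. A minimal sketch of that dedup behavior — a hypothetical `RuleSet` class for illustration, not the real `spark_parser` API:

```python
class RuleSet:
    """Sketch (assumption): grammar customization that ignores a rule
    it has already been given, so repeated opcodes don't duplicate rules."""
    def __init__(self):
        self.rules = []
        self.seen = set()

    def add_unique_rule(self, rule):
        if rule not in self.seen:
            self.seen.add(rule)
            self.rules.append(rule)

    def add_unique_rules(self, rules):
        for rule in rules:
            self.add_unique_rule(rule)

rs = RuleSet()
rs.add_unique_rules(["kvlist_2 ::= kv3 kv3",
                     "dict ::= BUILD_MAP_2 kvlist_2"])
rs.add_unique_rules(["dict ::= BUILD_MAP_2 kvlist_2"])  # duplicate is ignored
print(len(rs.rules))  # 2
```

Seeing the same `BUILD_MAP_n` opcode twice in a code object therefore leaves the grammar unchanged after the first customization.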
View File

@@ -27,7 +27,7 @@ class Python21Parser(Python22Parser):
def p_import21(self, args):
'''
import_as ::= IMPORT_NAME_CONT store
alias ::= IMPORT_NAME_CONT store
'''
class Python21ParserSingle(Python22Parser, PythonParserSingle):

View File

@@ -35,7 +35,7 @@ class Python23Parser(Python24Parser):
while1stmt ::= _while1test l_stmts_opt JUMP_BACK
COME_FROM POP_TOP POP_BLOCK COME_FROM
list_compr ::= BUILD_LIST_0 DUP_TOP LOAD_ATTR store list_iter del_stmt
list_comp ::= BUILD_LIST_0 DUP_TOP LOAD_ATTR store list_iter del_stmt
list_for ::= expr _for store list_iter JUMP_BACK come_froms POP_TOP JUMP_BACK
lc_body ::= LOAD_NAME expr CALL_FUNCTION_1 POP_TOP
@@ -48,7 +48,7 @@ class Python23Parser(Python24Parser):
expr ::= and2
and2 ::= _jump jmp_false COME_FROM expr COME_FROM
import_as ::= IMPORT_NAME load_attrs store
alias ::= IMPORT_NAME load_attrs store
load_attrs ::= LOAD_ATTR+
conditional ::= expr jmp_false expr JUMP_FORWARD expr COME_FROM

View File

@@ -26,12 +26,12 @@ class Python24Parser(Python25Parser):
# 2.5+ has two LOAD_CONSTs, one for the number of '.'s in a relative import
# keep positions similar to simplify semantic actions
importstmt ::= filler LOAD_CONST import_as
importfrom ::= filler LOAD_CONST IMPORT_NAME importlist POP_TOP
importstar ::= filler LOAD_CONST IMPORT_NAME IMPORT_STAR
import ::= filler LOAD_CONST alias
import_from ::= filler LOAD_CONST IMPORT_NAME importlist POP_TOP
import_from_star ::= filler LOAD_CONST IMPORT_NAME IMPORT_STAR
importmultiple ::= filler LOAD_CONST import_as imports_cont
import_cont ::= filler LOAD_CONST import_as
importmultiple ::= filler LOAD_CONST alias imports_cont
import_cont ::= filler LOAD_CONST alias
# Python 2.5+ omits POP_TOP POP_BLOCK
while1stmt ::= SETUP_LOOP l_stmts JUMP_BACK POP_TOP POP_BLOCK COME_FROM

View File

@@ -71,9 +71,12 @@ class Python25Parser(Python26Parser):
return_if_stmts ::= return_if_stmt
return_stmt ::= ret_expr RETURN_END_IF POP_TOP
return_stmt ::= ret_expr RETURN_VALUE POP_TOP
return_stmt_lambda ::= ret_expr RETURN_VALUE_LAMBDA
setupwithas ::= DUP_TOP LOAD_ATTR ROT_TWO LOAD_ATTR CALL_FUNCTION_0 setup_finally
stmt ::= classdefdeco
stmt ::= conditional_lambda
conditional_lambda ::= expr jmp_false_then expr return_if_lambda
return_stmt_lambda LAMBDA_MARKER
""")
super(Python25Parser, self).add_custom_rules(tokens, customize)
if self.version == 2.5:

View File

@@ -195,9 +195,9 @@ class Python26Parser(Python2Parser):
list_iter ::= list_if JUMP_BACK
list_iter ::= list_if JUMP_BACK COME_FROM POP_TOP
list_compr ::= BUILD_LIST_0 DUP_TOP
list_comp ::= BUILD_LIST_0 DUP_TOP
store list_iter del_stmt
list_compr ::= BUILD_LIST_0 DUP_TOP
list_comp ::= BUILD_LIST_0 DUP_TOP
store list_iter JUMP_BACK del_stmt
lc_body ::= LOAD_NAME expr LIST_APPEND
lc_body ::= LOAD_FAST expr LIST_APPEND
@@ -248,6 +248,7 @@ class Python26Parser(Python2Parser):
compare_chained1 ::= expr DUP_TOP ROT_THREE COMPARE_OP
jmp_false compare_chained2 _come_from
return_if_lambda ::= RETURN_END_IF_LAMBDA POP_TOP
stmt ::= conditional_lambda
conditional_lambda ::= expr jmp_false_then expr return_if_lambda
return_stmt_lambda LAMBDA_MARKER
"""

View File

@@ -15,19 +15,19 @@ class Python27Parser(Python2Parser):
def p_comprehension27(self, args):
"""
list_for ::= expr _for store list_iter JUMP_BACK
list_compr ::= BUILD_LIST_0 list_iter
list_comp ::= BUILD_LIST_0 list_iter
lc_body ::= expr LIST_APPEND
stmt ::= setcomp_func
# Dictionary and set comprehensions were added in Python 2.7
expr ::= dictcomp
expr ::= dict_comp
dict_comp ::= LOAD_DICTCOMP MAKE_FUNCTION_0 expr GET_ITER CALL_FUNCTION_1
stmt ::= dictcomp_func
dictcomp_func ::= BUILD_MAP_0 LOAD_FAST FOR_ITER store
comp_iter JUMP_BACK RETURN_VALUE RETURN_LAST
dictcomp ::= LOAD_DICTCOMP MAKE_FUNCTION_0 expr GET_ITER CALL_FUNCTION_1
setcomp_func ::= BUILD_SET_0 LOAD_FAST FOR_ITER store comp_iter
JUMP_BACK RETURN_VALUE RETURN_LAST
@@ -126,6 +126,7 @@ class Python27Parser(Python2Parser):
# Common with 2.6
return_if_lambda ::= RETURN_END_IF_LAMBDA COME_FROM
stmt ::= conditional_lambda
conditional_lambda ::= expr jmp_false expr return_if_lambda
return_stmt_lambda LAMBDA_MARKER

View File

@@ -32,9 +32,6 @@ class Python3Parser(PythonParser):
# Python3 scanner adds LOAD_LISTCOMP. Python3 does list comprehension like
# other comprehensions (set, dictionary).
# listcomp is a custom Python3 rule
expr ::= listcomp
# Our "continue" heuristic - in two successive JUMP_BACKS, the first
# one may be a continue - sometimes classifies a JUMP_BACK
# as a CONTINUE. The two are kind of the same in a comprehension.
@@ -69,11 +66,12 @@ class Python3Parser(PythonParser):
def p_dictcomp3(self, args):
""""
expr ::= dictcomp
expr ::= dict_comp
stmt ::= dictcomp_func
dictcomp_func ::= BUILD_MAP_0 LOAD_FAST FOR_ITER store
comp_iter JUMP_BACK RETURN_VALUE RETURN_LAST
dictcomp ::= LOAD_DICTCOMP LOAD_CONST MAKE_FUNCTION_0 expr GET_ITER CALL_FUNCTION_1
dict_comp ::= LOAD_DICTCOMP LOAD_CONST MAKE_FUNCTION_0 expr
GET_ITER CALL_FUNCTION_1
"""
def p_grammar(self, args):
@@ -140,7 +138,6 @@ class Python3Parser(PythonParser):
testtrue ::= expr jmp_true
_ifstmts_jump ::= return_if_stmts
_ifstmts_jump ::= c_stmts_opt JUMP_FORWARD COME_FROM
_ifstmts_jump ::= c_stmts_opt COME_FROM
iflaststmt ::= testexpr c_stmts_opt JUMP_ABSOLUTE
@@ -185,7 +182,7 @@ class Python3Parser(PythonParser):
# this is nested inside a trystmt
tryfinallystmt ::= SETUP_FINALLY suite_stmts_opt
POP_BLOCK LOAD_CONST
come_from_or_finally suite_stmts_opt END_FINALLY
COME_FROM_FINALLY suite_stmts_opt END_FINALLY
tryelsestmt ::= SETUP_EXCEPT suite_stmts_opt POP_BLOCK
try_middle else_suite come_from_except_clauses
@@ -233,7 +230,7 @@ class Python3Parser(PythonParser):
except_suite_finalize ::= SETUP_FINALLY c_stmts_opt except_var_finalize
END_FINALLY _jump
except_var_finalize ::= POP_BLOCK POP_EXCEPT LOAD_CONST come_from_or_finally
except_var_finalize ::= POP_BLOCK POP_EXCEPT LOAD_CONST COME_FROM_FINALLY
LOAD_CONST store del_stmt
except_suite ::= return_stmts
@@ -275,19 +272,16 @@ class Python3Parser(PythonParser):
try_middle ::= JUMP_FORWARD COME_FROM_EXCEPT except_stmts
END_FINALLY COME_FROM_EXCEPT_CLAUSE
for_block ::= l_stmts_opt opt_come_from_loop JUMP_BACK
for_block ::= l_stmts_opt come_from_loops JUMP_BACK
for_block ::= l_stmts
iflaststmtl ::= testexpr c_stmts_opt
expr ::= conditionalTrue
conditionalTrue ::= expr JUMP_FORWARD expr COME_FROM
"""
def p_def_annotations3(self, args):
"""
# Annotated functions
stmt ::= funcdef_annotate
funcdef_annotate ::= mkfunc_annotate store
stmt ::= function_def_annotate
function_def_annotate ::= mkfunc_annotate store
mkfuncdeco0 ::= mkfunc_annotate
@@ -310,16 +304,8 @@ class Python3Parser(PythonParser):
opt_come_from_except ::= come_from_except_clauses
come_froms ::= COME_FROM*
come_from_except_clauses ::= COME_FROM_EXCEPT_CLAUSE+
opt_come_from_loop ::= opt_come_from_loop COME_FROM_LOOP
opt_come_from_loop ::= opt_come_from_loop COME_FROM_LOOP
opt_come_from_loop ::=
come_from_or_finally ::= COME_FROM_FINALLY
come_from_or_finally ::= COME_FROM
come_from_loops ::= COME_FROM_LOOP*
"""
def p_jump3(self, args):
@@ -354,7 +340,7 @@ class Python3Parser(PythonParser):
def p_loop_stmt3(self, args):
"""
forstmt ::= SETUP_LOOP expr _for store for_block POP_BLOCK
opt_come_from_loop
come_from_loops
forelsestmt ::= SETUP_LOOP expr _for store for_block POP_BLOCK else_suite
COME_FROM_LOOP
@@ -371,11 +357,6 @@ class Python3Parser(PythonParser):
whilestmt ::= SETUP_LOOP testexpr l_stmts_opt JUMP_BACK POP_BLOCK
COME_FROM_LOOP
# The JUMP_ABSOLUTE below comes from escaping an "if" block which surrounds
# the while. This is messy
whilestmt ::= SETUP_LOOP testexpr l_stmts_opt JUMP_BACK POP_BLOCK
JUMP_ABSOLUTE COME_FROM_LOOP
whilestmt ::= SETUP_LOOP testexpr return_stmts POP_BLOCK
COME_FROM_LOOP
@@ -412,7 +393,7 @@ class Python3Parser(PythonParser):
load_genexpr ::= BUILD_TUPLE_1 LOAD_GENEXPR LOAD_CONST
# Is there something general going on here?
dictcomp ::= load_closure LOAD_DICTCOMP LOAD_CONST MAKE_CLOSURE_0 expr GET_ITER CALL_FUNCTION_1
dict_comp ::= load_closure LOAD_DICTCOMP LOAD_CONST MAKE_CLOSURE_0 expr GET_ITER CALL_FUNCTION_1
'''
def p_expr3(self, args):
@@ -423,12 +404,6 @@ class Python3Parser(PythonParser):
# a JUMP_FORWARD to another JUMP_FORWARD can get turned into
# a JUMP_ABSOLUTE with no COME_FROM
conditional ::= expr jmp_false expr jump_absolute_else expr
return_if_lambda ::= RETURN_END_IF_LAMBDA
conditional_lambda ::= expr jmp_false return_stmt_lambda
return_stmt_lambda LAMBDA_MARKER
conditional_lambda ::= expr jmp_false expr return_if_lambda
return_stmt_lambda LAMBDA_MARKER
"""
@staticmethod
@@ -572,14 +547,14 @@ class Python3Parser(PythonParser):
# Even the below say _list, in the semantic rules we
# disambiguate tuples, and sets from lists
build_list ::= {expr}^n BUILD_LIST_n
build_list ::= {expr}^n BUILD_TUPLE_n
build_list ::= {expr}^n BUILD_LIST_UNPACK_n
build_list ::= {expr}^n BUILD_TUPLE_UNPACK_n
list ::= {expr}^n BUILD_LIST_n
list ::= {expr}^n BUILD_TUPLE_n
list ::= {expr}^n BUILD_LIST_UNPACK_n
list ::= {expr}^n BUILD_TUPLE_UNPACK_n
# FIXME:
build_list ::= {expr}^n BUILD_SET_n
build_list ::= {expr}^n BUILD_SET_UNPACK_n
list ::= {expr}^n BUILD_SET_n
list ::= {expr}^n BUILD_SET_UNPACK_n
should be
build_set ::= {expr}^n BUILD_SET_n
build_set ::= {expr}^n BUILD_SET_UNPACK_n
@@ -594,7 +569,7 @@ class Python3Parser(PythonParser):
# Is there something more general than this? adding pos_arg?
# Is there something corresponding using MAKE_CLOSURE?
dictcomp ::= LOAD_DICTCOMP [LOAD_CONST] MAKE_FUNCTION_0 expr
dict_comp ::= LOAD_DICTCOMP [LOAD_CONST] MAKE_FUNCTION_0 expr
GET_ITER CALL_FUNCTION_1
generator_exp ::= {pos_arg}^n load_genexpr [LOAD_CONST] MAKE_FUNCTION_n expr
@@ -609,12 +584,12 @@ class Python3Parser(PythonParser):
# Is there something more general than this? adding pos_arg?
# Is there something corresponding using MAKE_CLOSURE?
# For example:
# setcomp ::= {pos_arg}^n LOAD_SETCOMP [LOAD_CONST] MAKE_CLOSURE_n
# set_comp ::= {pos_arg}^n LOAD_SETCOMP [LOAD_CONST] MAKE_CLOSURE_n
GET_ITER CALL_FUNCTION_1
setcomp ::= LOAD_SETCOMP [LOAD_CONST] MAKE_FUNCTION_0 expr
set_comp ::= LOAD_SETCOMP [LOAD_CONST] MAKE_FUNCTION_0 expr
GET_ITER CALL_FUNCTION_1
setcomp ::= {pos_arg}^n load_closure LOAD_SETCOMP [LOAD_CONST]
set_comp ::= {pos_arg}^n load_closure LOAD_SETCOMP [LOAD_CONST]
MAKE_CLOSURE_n expr GET_ITER CALL_FUNCTION_1
mkfunc ::= {pos_arg}^n load_closure [LOAD_CONST] MAKE_FUNCTION_n
@@ -626,22 +601,25 @@ class Python3Parser(PythonParser):
load_attr ::= expr LOOKUP_METHOD
call ::= expr CALL_METHOD
"""
is_pypy = False
seen_LOAD_BUILD_CLASS = False
seen_LOAD_DICTCOMP = False
seen_LOAD_LISTCOMP = False
seen_LOAD_SETCOMP = False
seen_classdeco_end = False
seen_GET_AWAITABLE_YIELD_FROM = False
# Loop over instructions adding custom grammar rules based on
# a specific instruction seen.
if 'PyPy' in customize:
is_pypy = True
self.addRule("""
stmt ::= assign3_pypy
stmt ::= assign2_pypy
assign3_pypy ::= expr expr expr store store store
assign2_pypy ::= expr expr store store
return_if_lambda ::= RETURN_END_IF_LAMBDA
return_stmt_lambda ::= ret_expr RETURN_VALUE_LAMBDA
stmt ::= conditional_lambda
conditional_lambda ::= expr jmp_false expr return_if_lambda
return_stmt_lambda LAMBDA_MARKER
""", nop_func)
has_get_iter_call_function1 = False
@@ -666,7 +644,7 @@ class Python3Parser(PythonParser):
if opname_base == 'BUILD_CONST_KEY_MAP':
# This is in 3.6+
kvlist_n = 'expr ' * (token.attr)
rule = "mapexpr ::= %sLOAD_CONST %s" % (kvlist_n, opname)
rule = "dict ::= %sLOAD_CONST %s" % (kvlist_n, opname)
self.add_unique_rule(rule, opname, token.attr, customize)
elif opname.startswith('BUILD_LIST_UNPACK'):
v = token.attr
@@ -689,27 +667,27 @@ class Python3Parser(PythonParser):
self.add_unique_rule(rule, 'kvlist_n', 0, customize)
rule = 'kvlist_n ::='
self.add_unique_rule(rule, 'kvlist_n', 1, customize)
rule = "mapexpr ::= BUILD_MAP_n kvlist_n"
rule = "dict ::= BUILD_MAP_n kvlist_n"
elif self.version >= 3.5:
if opname != 'BUILD_MAP_WITH_CALL':
if opname == 'BUILD_MAP_UNPACK':
rule = kvlist_n + ' ::= ' + 'expr ' * (token.attr*2)
self.add_unique_rule(rule, opname, token.attr, customize)
rule = 'dict ::= ' + 'expr ' * (token.attr*2)
rule = 'dict_entry ::= ' + 'expr ' * (token.attr*2)
self.add_unique_rule(rule, opname, token.attr, customize)
rule = 'mapexpr ::= ' + 'dict ' * token.attr
rule = 'dict ::= ' + 'dict_entry ' * token.attr
self.add_unique_rule(rule, opname, token.attr, customize)
rule = ('unmap_dict ::= ' +
('mapexpr ' * token.attr) +
('dict ' * token.attr) +
'BUILD_MAP_UNPACK')
else:
rule = kvlist_n + ' ::= ' + 'expr ' * (token.attr*2)
self.add_unique_rule(rule, opname, token.attr, customize)
rule = "mapexpr ::= %s %s" % (kvlist_n, opname)
rule = "dict ::= %s %s" % (kvlist_n, opname)
else:
rule = kvlist_n + ' ::= ' + 'expr expr STORE_MAP ' * token.attr
self.add_unique_rule(rule, opname, token.attr, customize)
rule = "mapexpr ::= %s %s" % (opname, kvlist_n)
rule = "dict ::= %s %s" % (opname, kvlist_n)
self.add_unique_rule(rule, opname, token.attr, customize)
elif opname.startswith('BUILD_MAP_UNPACK_WITH_CALL'):
v = token.attr
@@ -723,7 +701,7 @@ class Python3Parser(PythonParser):
is_LOAD_CLOSURE = False
if opname_base == 'BUILD_TUPLE':
# If is part of a "load_closure", then it is not part of a
# "build_list".
# "list".
is_LOAD_CLOSURE = True
for j in range(v):
if tokens[i-j-1].kind != 'LOAD_CLOSURE':
@@ -733,7 +711,7 @@ class Python3Parser(PythonParser):
rule = ('load_closure ::= %s%s' % (('LOAD_CLOSURE ' * v), opname))
self.add_unique_rule(rule, opname, token.attr, customize)
if not is_LOAD_CLOSURE or v == 0:
rule = ('build_list ::= ' + 'expr1024 ' * int(v//1024) +
rule = ('list ::= ' + 'expr1024 ' * int(v//1024) +
'expr32 ' * int((v//32) % 32) +
'expr ' * (v % 32) + opname)
self.add_unique_rule(rule, opname, token.attr, customize)
@@ -804,21 +782,19 @@ class Python3Parser(PythonParser):
opname, token.attr, customize)
continue
elif opname == 'LOAD_DICTCOMP':
seen_LOAD_DICTCOMP = True
if has_get_iter_call_function1:
rule_pat = ("dictcomp ::= LOAD_DICTCOMP %sMAKE_FUNCTION_0 expr "
rule_pat = ("dict_comp ::= LOAD_DICTCOMP %sMAKE_FUNCTION_0 expr "
"GET_ITER CALL_FUNCTION_1")
self.add_make_function_rule(rule_pat, opname, token.attr, customize)
# listcomp is a custom Python3 rule
elif opname == 'LOAD_LISTCOMP':
seen_LOAD_LISTCOMP = True
continue
self.add_unique_rule("expr ::= listcomp", opname, token.attr, customize)
elif opname == 'LOAD_SETCOMP':
seen_LOAD_SETCOMP = True
# Should this be generalized and put under MAKE_FUNCTION?
if has_get_iter_call_function1:
self.add_unique_rule("expr ::= setcomp",
self.add_unique_rule("expr ::= set_comp",
opname, token.attr, customize)
rule_pat = ("setcomp ::= LOAD_SETCOMP %sMAKE_FUNCTION_0 expr "
rule_pat = ("set_comp ::= LOAD_SETCOMP %sMAKE_FUNCTION_0 expr "
"GET_ITER CALL_FUNCTION_1")
self.add_make_function_rule(rule_pat, opname, token.attr, customize)
elif opname == 'LOOKUP_METHOD':
@@ -831,6 +807,9 @@ class Python3Parser(PythonParser):
# Note: this probably doesn't handle kwargs properly
args_pos, args_kw, annotate_args = token.attr
# FIXME: Fold test into add_make_function_rule
j = 1 if self.version < 3.3 else 2
if is_pypy or (i >= j and tokens[i-j] == 'LOAD_LAMBDA'):
rule_pat = ('mklambda ::= %sload_closure LOAD_LAMBDA %%s%s' %
('pos_arg '* args_pos, opname))
self.add_make_function_rule(rule_pat, opname, token.attr, customize)
@@ -839,20 +818,26 @@ class Python3Parser(PythonParser):
rule_pat = ("generator_exp ::= %sload_closure load_genexpr %%s%s expr "
"GET_ITER CALL_FUNCTION_1" % ('pos_arg '* args_pos, opname))
self.add_make_function_rule(rule_pat, opname, token.attr, customize)
if seen_LOAD_LISTCOMP:
if has_get_iter_call_function1:
if (is_pypy or (i >= j and tokens[i-j] == 'LOAD_LISTCOMP')):
# In the tokens we saw:
# LOAD_LISTCOMP LOAD_CONST MAKE_FUNCTION (>= 3.3) or
# LOAD_LISTCOMP MAKE_FUNCTION (< 3.3)
# and have GET_ITER CALL_FUNCTION_1
# TODO: For PyPy we need to modify this slightly
rule_pat = ('listcomp ::= %sload_closure LOAD_LISTCOMP %%s%s expr '
'GET_ITER CALL_FUNCTION_1' % ('pos_arg ' * args_pos, opname))
self.add_make_function_rule(rule_pat, opname, token.attr, customize)
if seen_LOAD_SETCOMP:
rule_pat = ('setcomp ::= %sload_closure LOAD_SETCOMP %%s%s expr '
if (is_pypy or (i >= j and tokens[i-j] == 'LOAD_SETCOMP')):
rule_pat = ('set_comp ::= %sload_closure LOAD_SETCOMP %%s%s expr '
'GET_ITER CALL_FUNCTION_1' % ('pos_arg ' * args_pos, opname))
self.add_make_function_rule(rule_pat, opname, token.attr, customize)
if seen_LOAD_DICTCOMP:
self.add_unique_rule('dictcomp ::= %sload_closure LOAD_DICTCOMP %s '
if (is_pypy or (i >= j and tokens[i-j] == 'LOAD_DICTCOMP')):
self.add_unique_rule('dict_comp ::= %sload_closure LOAD_DICTCOMP %s '
'expr GET_ITER CALL_FUNCTION_1' %
('pos_arg '* args_pos, opname),
opname, token.attr, customize)
# FIXME: kwarg processing is missing here.
# Note order of kwargs and pos args changed between 3.3-3.4
if self.version <= 3.2:
@@ -894,37 +879,47 @@ class Python3Parser(PythonParser):
rule_pat = ("generator_exp ::= %sload_closure load_genexpr %%s%s expr "
"GET_ITER CALL_FUNCTION_1" % ('pos_arg '* args_pos, opname))
self.add_make_function_rule(rule_pat, opname, token.attr, customize)
if is_pypy or (i >= 2 and tokens[i-2] == 'LOAD_LISTCOMP'):
rule_pat = ("listcomp ::= %sLOAD_LISTCOMP %%s%s expr "
"GET_ITER CALL_FUNCTION_1" % ('expr ' * args_pos, opname))
self.add_make_function_rule(rule_pat, opname, token.attr, customize)
if is_pypy or (i >= 2 and tokens[i-2] == 'LOAD_LAMBDA'):
rule_pat = ('mklambda ::= %s%sLOAD_LAMBDA %%s%s' %
(('pos_arg '* args_pos),
('kwarg '* args_kw),
opname))
self.add_make_function_rule(rule_pat, opname, token.attr, customize)
if seen_LOAD_LISTCOMP and has_get_iter_call_function1:
rule_pat = ("listcomp ::= %sLOAD_LISTCOMP %%s%s expr "
"GET_ITER CALL_FUNCTION_1" % ('expr ' * args_pos, opname))
self.add_make_function_rule(rule_pat, opname, token.attr, customize)
continue
if self.version < 3.6:
args_pos, args_kw, annotate_args = token.attr
else:
args_pos, args_kw, annotate_args, closure = token.attr
j = 1 if self.version < 3.3 else 2
if has_get_iter_call_function1:
rule_pat = ("generator_exp ::= %sload_genexpr %%s%s expr "
"GET_ITER CALL_FUNCTION_1" % ('pos_arg '* args_pos, opname))
self.add_make_function_rule(rule_pat, opname, token.attr, customize)
if is_pypy or (i >= j and tokens[i-j] == 'LOAD_LISTCOMP'):
# In the tokens we saw:
# LOAD_LISTCOMP LOAD_CONST MAKE_FUNCTION (>= 3.3) or
# LOAD_LISTCOMP MAKE_FUNCTION (< 3.3)
# and have GET_ITER CALL_FUNCTION_1
# TODO: For PyPy we need to modify this slightly
rule_pat = ("listcomp ::= %sLOAD_LISTCOMP %%s%s expr "
"GET_ITER CALL_FUNCTION_1" % ('expr ' * args_pos, opname))
self.add_make_function_rule(rule_pat, opname, token.attr, customize)
# FIXME: Fold test into add_make_function_rule
if is_pypy or (i >= j and tokens[i-j] == 'LOAD_LAMBDA'):
rule_pat = ('mklambda ::= %s%sLOAD_LAMBDA %%s%s' %
(('pos_arg '* args_pos),
('kwarg '* args_kw),
opname))
self.add_make_function_rule(rule_pat, opname, token.attr, customize)
if seen_LOAD_LISTCOMP and has_get_iter_call_function1:
rule_pat = ("listcomp ::= %sLOAD_LISTCOMP %%s%s expr "
"GET_ITER CALL_FUNCTION_1" % ('expr ' * args_pos, opname))
self.add_make_function_rule(rule_pat, opname, token.attr, customize)
if self.version == 3.3:
# positional args after keyword args
rule = ('mkfunc ::= kwargs %s%s %s' %
@@ -978,8 +973,8 @@ class Python3Parser(PythonParser):
self.add_unique_rule(rule, opname, token.attr, customize)
elif opname_base == 'UNPACK_LIST':
rule = 'unpack_list ::= ' + opname + ' store' * token.attr
self.check_reduce['augassign1'] = 'AST'
self.check_reduce['augassign2'] = 'AST'
self.check_reduce['aug_assign1'] = 'AST'
self.check_reduce['aug_assign2'] = 'AST'
self.check_reduce['while1stmt'] = 'noAST'
self.check_reduce['annotate_tuple'] = 'noAST'
self.check_reduce['kwarg'] = 'noAST'
@@ -989,7 +984,7 @@ class Python3Parser(PythonParser):
def reduce_is_invalid(self, rule, ast, tokens, first, last):
lhs = rule[0]
if lhs in ('augassign1', 'augassign2') and ast[0][0] == 'and':
if lhs in ('aug_assign1', 'aug_assign2') and ast[0][0] == 'and':
return True
elif lhs == 'annotate_tuple':
return not isinstance(tokens[first].attr, tuple)

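The `BUILD_TUPLE`/`BUILD_LIST` customization in this file builds chunked rules using `expr1024` and `expr32` groups, so a collection literal with many elements doesn't yield a grammar rule with thousands of symbols on its right-hand side. A sketch mirroring the decomposition in the hunk above (the helper name is hypothetical):

```python
def build_list_rule(v, opname):
    """Decompose an element count v into runs of 1024, 32, and single
    expressions, matching the chunked rule construction above."""
    return ('list ::= ' + 'expr1024 ' * (v // 1024) +
            'expr32 ' * ((v // 32) % 32) +
            'expr ' * (v % 32) + opname)

print(build_list_rule(33, 'BUILD_LIST_33'))
# list ::= expr32 expr BUILD_LIST_33
```

With 33 elements the rule needs only one `expr32` plus one `expr`, rather than 33 `expr` symbols.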
View File

@@ -22,7 +22,6 @@ class Python32Parser(Python3Parser):
# Python < 3.5 no POP BLOCK
whileTruestmt ::= SETUP_LOOP l_stmts_opt JUMP_BACK COME_FROM_LOOP
whileTruestmt ::= SETUP_LOOP return_stmts COME_FROM_LOOP
# Python 3.5+ has jump optimization to remove the redundant
# jump_excepts. But in 3.3 we need them added
@@ -49,10 +48,8 @@ class Python32Parser(Python3Parser):
stmt ::= del_deref_stmt
del_deref_stmt ::= DELETE_DEREF
list_compr ::= BUILD_LIST_0 list_iter
list_comp ::= BUILD_LIST_0 list_iter
lc_body ::= expr LIST_APPEND
kvlist ::= kvlist kv3
kv3 ::= expr expr STORE_MAP
"""
pass
@@ -65,9 +62,15 @@ class Python32Parser(Python3Parser):
pass
def add_custom_rules(self, tokens, customize):
# self.remove_rules("""
# compare_chained2 ::= expr COMPARE_OP RETURN_VALUE
# """)
self.remove_rules("""
try_middle ::= JUMP_FORWARD COME_FROM except_stmts END_FINALLY COME_FROM
try_middle ::= JUMP_FORWARD COME_FROM except_stmts END_FINALLY COME_FROM_EXCEPT
try_middle ::= JUMP_FORWARD COME_FROM_EXCEPT except_stmts END_FINALLY COME_FROM_EXCEPT_CLAUSE
try_middle ::= jmp_abs COME_FROM except_stmts END_FINALLY
tryelsestmt ::= SETUP_EXCEPT suite_stmts_opt POP_BLOCK try_middle else_suite come_from_except_clauses
whileTruestmt ::= SETUP_LOOP l_stmts_opt JUMP_BACK NOP COME_FROM_LOOP
whileTruestmt ::= SETUP_LOOP l_stmts_opt JUMP_BACK POP_BLOCK NOP COME_FROM_LOOP
""")
super(Python32Parser, self).add_custom_rules(tokens, customize)
for i, token in enumerate(tokens):
opname = token.kind

View File

@@ -17,26 +17,19 @@ class Python33Parser(Python32Parser):
# We do the grammar hackery below for semantic
# actions that want c_stmts_opt at index 1
whileTruestmt ::= SETUP_LOOP l_stmts JUMP_ABSOLUTE
JUMP_BACK COME_FROM_LOOP
# Python 3.5+ has jump optimization to remove the redundant
# jump_excepts. But in 3.3 we need them added
trystmt ::= SETUP_EXCEPT suite_stmts_opt POP_BLOCK
try_middle
jump_excepts come_from_except_clauses
mapexpr ::= BUILD_MAP kvlist
"""
def add_custom_rules(self, tokens, customize):
self.remove_rules("""
# 3.3+ adds POP_BLOCKS
whileTruestmt ::= SETUP_LOOP l_stmts JUMP_ABSOLUTE JUMP_BACK COME_FROM_LOOP
whileTruestmt ::= SETUP_LOOP l_stmts_opt JUMP_BACK NOP COME_FROM_LOOP
whileTruestmt ::= SETUP_LOOP l_stmts_opt JUMP_BACK POP_BLOCK NOP COME_FROM_LOOP
whilestmt ::= SETUP_LOOP testexpr l_stmts_opt JUMP_BACK
POP_BLOCK JUMP_ABSOLUTE COME_FROM_LOOP
""")
super(Python33Parser, self).add_custom_rules(tokens, customize)
return

View File

@@ -3,7 +3,7 @@
spark grammar differences over Python 3.4 for Python 3.5.
"""
from uncompyle6.parser import PythonParserSingle
from uncompyle6.parser import PythonParserSingle, nop_func
from spark_parser import DEFAULT_DEBUG as PARSER_DEFAULT_DEBUG
from uncompyle6.parsers.parse34 import Python34Parser
@@ -32,22 +32,8 @@ class Python35Parser(Python34Parser):
stmt ::= await_stmt
await_stmt ::= await_expr POP_TOP
expr ::= unmap_dict
expr ::= unmapexpr
unmap_dict ::= dictcomp BUILD_MAP_UNPACK
unmap_dict ::= kv_lists BUILD_MAP_UNPACK
kv_lists ::= kv_list kv_lists
kv_lists ::= kv_list
# Python 3.5+ has WITH_CLEANUP_START/FINISH
withstmt ::= expr
SETUP_WITH exprlist suite_stmts_opt
POP_BLOCK LOAD_CONST COME_FROM_WITH
WITH_CLEANUP_START WITH_CLEANUP_FINISH END_FINALLY
withstmt ::= expr
SETUP_WITH POP_TOP suite_stmts_opt
POP_BLOCK LOAD_CONST COME_FROM_WITH
@@ -90,7 +76,7 @@ class Python35Parser(Python34Parser):
POP_TOP POP_TOP POP_TOP POP_EXCEPT POP_BLOCK
JUMP_ABSOLUTE END_FINALLY COME_FROM
for_block POP_BLOCK JUMP_ABSOLUTE
opt_come_from_loop
come_from_loops
async_for_stmt ::= SETUP_LOOP expr
GET_AITER
@@ -102,7 +88,7 @@ class Python35Parser(Python34Parser):
POP_TOP POP_TOP POP_TOP POP_EXCEPT POP_BLOCK
JUMP_ABSOLUTE END_FINALLY JUMP_BACK
passstmt POP_BLOCK JUMP_ABSOLUTE
opt_come_from_loop
come_from_loops
stmt ::= async_forelse_stmt
async_forelse_stmt ::= SETUP_LOOP expr
@@ -127,7 +113,6 @@ class Python35Parser(Python34Parser):
return_if_stmt ::= ret_expr RETURN_END_IF POP_BLOCK
ifelsestmtc ::= testexpr c_stmts_opt JUMP_FORWARD else_suitec
ifelsestmtc ::= testexpr c_stmts_opt jf_else else_suitec
# ifstmt ::= testexpr c_stmts_opt
@@ -141,24 +126,37 @@ class Python35Parser(Python34Parser):
def add_custom_rules(self, tokens, customize):
self.remove_rules("""
# FIXME: should this be in 3.3?
whileTruestmt ::= SETUP_LOOP return_stmts COME_FROM_LOOP
yield_from ::= expr GET_ITER LOAD_CONST YIELD_FROM
yield_from ::= expr expr YIELD_FROM
withstmt ::= expr SETUP_WITH POP_TOP suite_stmts_opt
POP_BLOCK LOAD_CONST COME_FROM_WITH
WITH_CLEANUP END_FINALLY
withasstmt ::= expr SETUP_WITH store suite_stmts_opt
POP_BLOCK LOAD_CONST COME_FROM_WITH
WITH_CLEANUP END_FINALLY
""")
super(Python35Parser, self).add_custom_rules(tokens, customize)
for i, token in enumerate(tokens):
opname = token.kind
if opname == 'BUILD_MAP_UNPACK_WITH_CALL':
self.addRule("expr ::= unmapexpr", nop_func)
nargs = token.attr % 256
map_unpack_n = "map_unpack_%s" % nargs
rule = map_unpack_n + ' ::= ' + 'expr ' * (nargs)
self.add_unique_rule(rule, opname, token.attr, customize)
self.addRule(rule, nop_func)
rule = "unmapexpr ::= %s %s" % (map_unpack_n, opname)
self.add_unique_rule(rule, opname, token.attr, customize)
self.addRule(rule, nop_func)
call_token = tokens[i+1]
if self.version == 3.5:
rule = 'call ::= expr unmapexpr ' + call_token.kind
self.add_unique_rule(rule, opname, token.attr, customize)
self.addRule(rule, nop_func)
pass
elif opname == 'BUILD_MAP_UNPACK':
self.addRule("""
expr ::= unmap_dict
unmap_dict ::= dict_comp BUILD_MAP_UNPACK
""", nop_func)
pass
return

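The `nargs = token.attr % 256` line above follows the CPython 3.5 encoding of the `BUILD_MAP_UNPACK_WITH_CALL` oparg: the low byte is the number of mappings to merge, and the higher byte gives the position of the callable below them on the stack. A small decoding sketch (the function name is hypothetical):

```python
def decode_unpack_with_call(attr):
    """Split a BUILD_MAP_UNPACK_WITH_CALL oparg into
    (number of mappings, relative position of the function),
    per the CPython 3.5 dis documentation."""
    return attr % 256, attr >> 8

nargs, func_pos = decode_unpack_with_call(0x0203)
print(nargs, func_pos)  # 3 2
```

Only the mapping count feeds the `map_unpack_%s` rule above; the function position matters to the interpreter, not to the grammar.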
View File

@@ -19,8 +19,12 @@ class Python36Parser(Python35Parser):
# 3.6 redoes how return_closure works
return_closure ::= LOAD_CLOSURE DUP_TOP STORE_NAME RETURN_VALUE RETURN_LAST
fstring_multi ::= fstring_expr_or_strs BUILD_STRING
fstring_expr_or_strs ::= fstring_expr_or_str+
stmt ::= conditional_lambda
conditional_lambda ::= expr jmp_false expr return_if_lambda
return_stmt_lambda LAMBDA_MARKER
return_stmt_lambda ::= ret_expr RETURN_VALUE_LAMBDA
return_if_lambda ::= RETURN_END_IF_LAMBDA
func_args36 ::= expr BUILD_TUPLE_0
call ::= func_args36 unmapexpr CALL_FUNCTION_EX
@@ -92,6 +96,8 @@ class Python36Parser(Python35Parser):
fstring_expr_or_str ::= str
expr ::= fstring_multi
fstring_multi ::= fstring_expr_or_strs BUILD_STRING
fstring_expr_or_strs ::= fstring_expr_or_str+
fstring_multi ::= %s BUILD_STRING
%s ::= %sBUILD_STRING
""" % (fstring_expr_or_str_n, fstring_expr_or_str_n, "fstring_expr_or_str " * v)

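The `%`-formatting that builds the per-arity f-string rule can be checked in isolation. This sketch mirrors the string construction in the hunk above (the wrapper function is hypothetical):

```python
def fstring_rules(v):
    """Build the customized fstring_multi rules for an f-string that
    BUILD_STRING joins from v pieces, as in the hunk above."""
    fstring_expr_or_str_n = "fstring_expr_or_strs_%s" % v
    return """
    expr ::= fstring_multi
    fstring_multi ::= %s BUILD_STRING
    %s ::= %sBUILD_STRING
    """ % (fstring_expr_or_str_n, fstring_expr_or_str_n,
           "fstring_expr_or_str " * v)

print(fstring_rules(2))
```

For `v = 2` this produces a nonterminal `fstring_expr_or_strs_2` whose right-hand side lists exactly two `fstring_expr_or_str` symbols before `BUILD_STRING`.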
View File

@@ -254,17 +254,15 @@ class Scanner2(Scanner):
else:
op_name = '%s_%d' % (op_name, oparg)
customize[op_name] = oparg
elif self.is_pypy and op_name in ('LOOKUP_METHOD',
'JUMP_IF_NOT_DEBUG',
'SETUP_EXCEPT',
'SETUP_FINALLY'):
elif self.is_pypy and op_name in frozenset(
"""LOOKUP_METHOD JUMP_IF_NOT_DEBUG SETUP_EXCEPT SETUP_FINALLY""".split()):
# The value in the dict is used by semantic actions only in special
# cases, such as CALL_FUNCTION. The value is not used for these opcodes,
# so we put in the arbitrary value 0.
customize[op_name] = 0
elif op == self.opc.CONTINUE_LOOP:
customize[op_name] = 0
elif op_name == 'LOAD_SETCOMP':
elif op_name in """
CONTINUE_LOOP EXEC_STMT LOAD_LISTCOMP LOAD_SETCOMP
""".split():
customize[op_name] = 0
elif op == self.opc.JUMP_ABSOLUTE:
# Further classify JUMP_ABSOLUTE into backward jumps
@@ -1027,33 +1025,45 @@ class Scanner2(Scanner):
pass
pass
# FIXME: All the < 2.7 conditions are horrible. We need a better way.
# FIXME FIXME FIXME
# All the conditions are horrible, and I am not sure I
# understand fully what's going on here.
# We REALLY REALLY need a better way to handle control flow,
# especially for < 2.7
if label is not None and label != -1:
# In Python < 2.7, the POP_TOP in:
# RETURN_VALUE, POP_TOP
# does not start a new statement.
# Otherwise, we want to add a "COME_FROM"
if not (self.version < 2.7 and
code[label] == self.opc.POP_TOP and
if self.version == 2.7:
# FIXME: rocky: I think we need something like this...
if label in self.setup_loops:
source = self.setup_loops[label]
else:
source = offset
targets[label] = targets.get(label, []) + [source]
elif not (code[label] == self.opc.POP_TOP and
code[self.prev[label]] == self.opc.RETURN_VALUE):
# In Python < 2.7, don't add a COME_FROM, for:
# JUMP_FORWARD, END_FINALLY
# RETURN_VALUE POP_TOP .. END_FINALLY
# or:
# JUMP_FORWARD, POP_TOP, END_FINALLY
if not (self.version < 2.7 and op == self.opc.JUMP_FORWARD
and ((code[offset+3] == self.opc.END_FINALLY)
or (code[offset+3] == self.opc.POP_TOP
and code[offset+4] == self.opc.END_FINALLY))):
# RETURN_VALUE POP_TOP .. POP_TOP END_FINALLY
skip_come_from = False
if self.version <= 2.5:
skip_come_from = (code[offset+3] == self.opc.END_FINALLY or
(code[offset+3] == self.opc.POP_TOP
and code[offset+4] == self.opc.END_FINALLY))
else:
skip_come_from = (code[offset+3] == self.opc.END_FINALLY or
(op != self.opc.JUMP_FORWARD
and code[offset+3] == self.opc.POP_TOP
and code[offset+4] == self.opc.END_FINALLY))
if not skip_come_from:
# FIXME: rocky: I think we need something like this...
if offset not in set(self.ignore_if) or self.version == 2.7:
if offset not in set(self.ignore_if):
if label in self.setup_loops:
source = self.setup_loops[label]
else:
source = offset
targets[label] = targets.get(label, []) + [source]
pass
pass
pass
pass
elif op == self.opc.END_FINALLY and offset in self.fixed_jumps and self.version == 2.7:

View File

@@ -240,6 +240,10 @@ class Scanner26(scan.Scanner2):
customize[op_name] = oparg
elif self.version > 2.0 and op == self.opc.CONTINUE_LOOP:
customize[op_name] = 0
elif op_name in """
CONTINUE_LOOP EXEC_STMT LOAD_LISTCOMP LOAD_SETCOMP
""".split():
customize[op_name] = 0
elif op == self.opc.JUMP_ABSOLUTE:
# Further classify JUMP_ABSOLUTE into backward jumps
# which are used in loops, and "CONTINUE" jumps which

View File

@@ -14,7 +14,7 @@ def checker(ast, in_loop, errors):
in_loop = in_loop or ast.kind in ('while1stmt', 'whileTruestmt',
'whilestmt', 'whileelsestmt', 'while1elsestmt',
'for_block')
if ast.kind in ('augassign1', 'augassign2') and ast[0][0] == 'and':
if ast.kind in ('aug_assign1', 'aug_assign2') and ast[0][0] == 'and':
text = str(ast)
error_text = '\n# improper augmented assignment (e.g. +=, *=, ...):\n#\t' + '\n# '.join(text.split("\n")) + '\n'
errors.append(error_text)

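The reduce check in this file can be sketched as a standalone predicate — a simplified sketch, assuming plain kind strings instead of real AST nodes:

```python
def improper_aug_assign(kind, first_child_kind):
    # Same shape the checker above rejects: an augmented assignment
    # whose value subtree parsed as an 'and' node signals a misparse.
    return (kind in ('aug_assign1', 'aug_assign2')
            and first_child_kind == 'and')

print(improper_aug_assign('aug_assign1', 'and'))   # True
print(improper_aug_assign('aug_assign2', 'expr'))  # False
```

When the predicate fires, the checker records the offending subtree text as a commented error block in the decompiled output rather than silently emitting wrong source.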
View File

@@ -156,7 +156,6 @@ TABLE_DIRECT = {
'unpack_list': ( '[%C]', (1, maxint, ', ') ),
'build_tuple2': ( '%P', (0, -1, ', ', 100) ),
# 'list_compr': ( '[ %c ]', -2), # handled by n_list_compr
'list_iter': ( '%c', 0 ),
'list_for': ( ' for %c in %c%c', 2, 0, 3 ),
'list_if': ( ' if %c%c', 0, 2 ),
@@ -176,9 +175,9 @@ TABLE_DIRECT = {
# The 2nd parameter should have a = suffix.
# There is a rule with a 4th parameter "store"
# which we don't use here.
'augassign1': ( '%|%c %c %c\n', 0, 2, 1),
'aug_assign1': ( '%|%c %c %c\n', 0, 2, 1),
'augassign2': ( '%|%c.%[2]{pattr} %c %c\n', 0, -3, -4 ),
'aug_assign2': ( '%|%c.%[2]{pattr} %c %c\n', 0, -3, -4 ),
'designList': ( '%c = %c', 0, -1 ),
'and': ( '%c and %c', 0, 2 ),
'ret_and': ( '%c and %c', 0, 2 ),
@@ -197,7 +196,7 @@ TABLE_DIRECT = {
'compare_chained1': ( '%[3]{pattr} %p %p', (0, 19), (-2, 19)),
'compare_chained2': ( '%[1]{pattr} %p', (0, 19)),
# 'classdef': (), # handled by n_classdef()
'funcdef': ( '\n\n%|def %c\n', -2), # -2 to handle closures
'function_def': ( '\n\n%|def %c\n', -2), # -2 to handle closures
'funcdefdeco': ( '\n\n%c', 0),
'mkfuncdeco': ( '%|@%c\n%c', 0, 1),
'mkfuncdeco0': ( '%|def %c\n', 0),
@@ -273,12 +272,11 @@ TABLE_DIRECT = {
'STORE_FAST': ( '%{pattr}', ),
'kv': ( '%c: %c', 3, 1 ),
'kv2': ( '%c: %c', 1, 2 ),
'mapexpr': ( '{%[1]C}', (0, maxint, ', ') ),
'importstmt': ( '%|import %c\n', 2),
'import': ( '%|import %c\n', 2),
'importlist': ( '%C', (0, maxint, ', ') ),
'importfrom': ( '%|from %[2]{pattr} import %c\n',
'import_from': ( '%|from %[2]{pattr} import %c\n',
(3, 'importlist') ),
'importstar': ( '%|from %[2]{pattr} import *\n', ),
'import_from_star': ( '%|from %[2]{pattr} import *\n', ),
}
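The TABLE_DIRECT entries above pair a format string with child indices. A toy interpreter for the two most common specifiers — `%|` (indent) and `%c` (expand the numbered child) — gives the flavor (a sketch only; the real engine supports many more specifiers such as `%C`, `%P`, and `%x`):

```python
def expand(template, children, indent='    '):
    # template is (format_string, *child_indices), as in TABLE_DIRECT.
    fmt = template[0]
    args = list(template[1:])
    out, i = '', 0
    while i < len(fmt):
        if fmt[i] == '%':
            spec = fmt[i + 1]
            if spec == '|':          # emit current indentation
                out += indent
            elif spec == 'c':        # expand the next indexed child
                out += children[args.pop(0)]
            elif spec == '%':        # literal percent sign
                out += '%'
            i += 2
        else:
            out += fmt[i]
            i += 1
    return out

expand(('%|import %c\n', 2), ['LOAD_CONST', 'IMPORT_NAME', 'os'])
# '    import os\n'
```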
@@ -299,12 +297,12 @@ MAP = {
# or https://docs.python.org/3/reference/expressions.html
# for a list.
PRECEDENCE = {
'build_list': 0,
'mapexpr': 0,
'list': 0,
'dict': 0,
'unary_convert': 0,
'dictcomp': 0,
'setcomp': 0,
'list_compr': 0,
'dict_comp': 0,
'set_comp': 0,
'list_comp': 0,
'generator_exp': 0,
'load_attr': 2,


@@ -18,7 +18,7 @@ We add some format specifiers here not used in pysource
from src to dest.
For example in:
'importstmt': ( '%|import %c%x\n', 2, (2,(0,1)), ),
'import': ( '%|import %c%x\n', 2, (2,(0,1)), ),
node 2's range information (the %c) is copied to nodes 0 and 1.
@@ -91,7 +91,7 @@ TABLE_DIRECT_FRAGMENT = {
'continue_stmt': ( '%|%rcontinue\n', ),
'passstmt': ( '%|%rpass\n', ),
'raise_stmt0': ( '%|%rraise\n', ),
'importstmt': ( '%|import %c%x\n', 2, (2, (0, 1)), ),
'import': ( '%|import %c%x\n', 2, (2, (0, 1)), ),
'importfrom': ( '%|from %[2]{pattr}%x import %c\n', (2, (0, 1)), 3),
'importmultiple': ( '%|import%b %c%c\n', 0, 2, 3 ),
'list_for': (' for %c%x in %c%c', 2, (2, (1, )), 0, 3 ),
@@ -194,7 +194,7 @@ class FragmentsWalker(pysource.SourceWalker, object):
raise GenericASTTraversalPruningException
n_slice0 = n_slice1 = n_slice2 = n_slice3 = n_subscript = table_r_node
n_augassign_1 = n_print_item = exec_stmt = print_to_item = del_stmt = table_r_node
n_aug_assign_1 = n_print_item = exec_stmt = print_to_item = del_stmt = table_r_node
n_classdefco1 = n_classdefco2 = except_cond1 = except_cond2 = table_r_node
def n_passtmt(self, node):
@@ -490,7 +490,7 @@ class FragmentsWalker(pysource.SourceWalker, object):
self.set_pos_info(node, start, len(self.f.getvalue()))
self.prune()
def n_import_as(self, node):
def n_alias(self, node):
start = len(self.f.getvalue())
iname = node[0].pattr
@@ -543,8 +543,8 @@ class FragmentsWalker(pysource.SourceWalker, object):
self.indent_less()
self.prune() # stop recursing
def n_list_compr(self, node):
"""List comprehensions the way they are done in Python 2."""
def n_list_comp(self, node):
"""List comprehensions"""
p = self.prec
self.prec = 27
n = node[-1]
@@ -571,7 +571,7 @@ class FragmentsWalker(pysource.SourceWalker, object):
self.prec = 27
# FIXME: clean this up
if self.version > 3.0 and node == 'dictcomp':
if self.version > 3.0 and node == 'dict_comp':
cn = node[1]
elif self.version > 3.0 and node == 'generator_exp':
if node[0] == 'load_genexpr':
@@ -784,7 +784,7 @@ class FragmentsWalker(pysource.SourceWalker, object):
self.set_pos_info(node, start, len(self.f.getvalue()))
self.prune()
def n_setcomp(self, node):
def n_set_comp(self, node):
start = len(self.f.getvalue())
self.write('{')
if node[0] in ['LOAD_SETCOMP', 'LOAD_DICTCOMP']:
@@ -799,7 +799,7 @@ class FragmentsWalker(pysource.SourceWalker, object):
self.set_pos_info(node, start, len(self.f.getvalue()))
self.prune()
# FIXME: Not sure if below is general. Also, add dictcomp_func.
# FIXME: Not sure if below is general. Also, add dict_comp_func.
# 'setcomp_func': ("%|lambda %c: {%c for %c in %c%c}\n", 1, 3, 3, 1, 4)
def n_setcomp_func(self, node):
setcomp_start = len(self.f.getvalue())
@@ -1324,10 +1324,10 @@ class FragmentsWalker(pysource.SourceWalker, object):
self.write(')')
self.set_pos_info(node, start, len(self.f.getvalue()))
def n_mapexpr(self, node):
def n_dict(self, node):
"""
prettyprint a mapexpr
'mapexpr' is something like k = {'a': 1, 'b': 42 }"
prettyprint a dict
'dict' is something like k = {'a': 1, 'b': 42 }
"""
p = self.prec
self.prec = 100
@@ -1340,7 +1340,7 @@ class FragmentsWalker(pysource.SourceWalker, object):
if self.version > 3.0:
if node[0].kind.startswith('kvlist'):
# Python 3.5+ style key/value list in mapexpr
# Python 3.5+ style key/value list in dict
kv_node = node[0]
l = list(kv_node)
i = 0
@@ -1355,7 +1355,7 @@ class FragmentsWalker(pysource.SourceWalker, object):
pass
pass
elif node[1].kind.startswith('kvlist'):
# Python 3.0..3.4 style key/value list in mapexpr
# Python 3.0..3.4 style key/value list in dict
kv_node = node[1]
l = list(kv_node)
if len(l) > 0 and l[0].kind == 'kv3':
@@ -1413,7 +1413,7 @@ class FragmentsWalker(pysource.SourceWalker, object):
self.prec = p
self.prune()
def n_build_list(self, node):
def n_list(self, node):
"""
prettyprint a list or tuple
"""
@@ -1431,7 +1431,7 @@ class FragmentsWalker(pysource.SourceWalker, object):
elif lastnode.startswith('ROT_TWO'):
self.write('('); endchar = ')'
else:
raise RuntimeError('Internal Error: n_build_list expects list or tuple')
raise RuntimeError('Internal Error: n_list expects list or tuple')
flat_elems = []
for elem in node:
@@ -1844,7 +1844,7 @@ if __name__ == '__main__':
# deparse_test(get_code_for_fn(gcd))
# deparse_test(get_code_for_fn(test))
# deparse_test(get_code_for_fn(FragmentsWalker.fixup_offsets))
# deparse_test(get_code_for_fn(FragmentsWalker.n_build_list))
# deparse_test(get_code_for_fn(FragmentsWalker.n_list))
print('=' * 30)
deparse_test_around(408, 'n_build_list', get_code_for_fn(FragmentsWalker.n_build_list))
deparse_test_around(408, 'n_list', get_code_for_fn(FragmentsWalker.n_build_list))
# deparse_test(inspect.currentframe().f_code)


@@ -479,6 +479,9 @@ def make_function3(self, node, is_lambda, nested=1, codeNode=None):
- handle format tuple parameters
"""
if default:
if self.version >= 3.6:
value = default
else:
value = self.traverse(default, indent='')
maybe_show_ast_param_default(self.showast, name, value)
result = '%s=%s' % (name, value)
@@ -505,12 +508,17 @@ def make_function3(self, node, is_lambda, nested=1, codeNode=None):
defparams = node[:args_node.attr]
else:
default, kw, annotate, closure = args_node.attr
# FIXME: start here for Python 3.6 and above:
if default:
assert node[0] == 'expr', "expecting mkfunc default node to be an expr"
expr_node = node[0]
if (expr_node[0] == 'LOAD_CONST' and
isinstance(expr_node[0].attr, tuple)):
defparams = list(expr_node[0].attr)
elif expr_node[0] == 'list':
defparams = [self.traverse(n, indent='') for n in expr_node[0][:-1]]
else:
defparams = []
# if default:
# defparams = node[-(2 + kw + annotate + closure)]
# else:
# defparams = []
# FIXME: handle kw, annotate and closure
kw_args = 0
pass
@@ -536,7 +544,7 @@ def make_function3(self, node, is_lambda, nested=1, codeNode=None):
paramnames = list(code.co_varnames[:argc])
# defaults are for last n parameters, thus reverse
if not 3.0 <= self.version <= 3.1:
if not 3.0 <= self.version <= 3.1 or self.version >= 3.6:
paramnames.reverse(); defparams.reverse()
try:
@@ -557,10 +565,11 @@ def make_function3(self, node, is_lambda, nested=1, codeNode=None):
indent = self.indent
# build parameters
if self.version != 3.2:
tup = [paramnames, defparams]
params = [build_param(ast, name, default) for
name, default in map(lambda *tup:tup, *tup)]
params = [build_param(ast, name, default_value) for
name, default_value in map(lambda *tup:tup, *tup)]
if not 3.0 <= self.version <= 3.1 or self.version >= 3.6:
params.reverse() # back to correct order
if code_has_star_arg(code):
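The reverse-pair-reverse dance above exists because defaults bind to the *last* n parameters. Isolated as a sketch (hypothetical helper, not the actual `make_function3` code):

```python
def pair_defaults(paramnames, defaults):
    # Defaults belong to the last n parameters, so reverse both lists,
    # pair them up (parameters without a default get None), then
    # reverse back into declaration order.
    names = list(reversed(paramnames))
    defs = list(reversed(defaults))
    pairs = [(name, defs[i] if i < len(defs) else None)
             for i, name in enumerate(names)]
    pairs.reverse()
    return pairs

pair_defaults(['a', 'b', 'c'], ['10', '20'])
# [('a', None), ('b', '10'), ('c', '20')]  i.e. def f(a, b=10, c=20)
```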


@@ -359,7 +359,7 @@ class SourceWalker(GenericASTTraversal, object):
if version >= 3.0:
TABLE_DIRECT.update({
'funcdef_annotate': ( '\n\n%|def %c%c\n', -1, 0),
'function_def_annotate': ( '\n\n%|def %c%c\n', -1, 0),
'store_locals': ( '%|# inspect.currentframe().f_locals = __locals__\n', ),
})
@@ -384,7 +384,7 @@ class SourceWalker(GenericASTTraversal, object):
break
# FIXME: the real situation is that when derived from
# funcdef_annotate the name has been filled in.
# function_def_annotate the name has been filled in.
# But when derived from funcdefdeco it hasn't. Would like a better
# way to distinguish.
if self.f.getvalue()[-4:] == 'def ':
@@ -444,7 +444,7 @@ class SourceWalker(GenericASTTraversal, object):
node.kind == 'async_call'
self.prune()
self.n_async_call = n_async_call
self.n_build_list_unpack = self.n_build_list
self.n_build_list_unpack = self.n_list
if version == 3.5:
def n_call(node):
@@ -472,7 +472,7 @@ class SourceWalker(GenericASTTraversal, object):
self.default(node)
self.n_call = n_call
def n_funcdef(node):
def n_function_def(node):
if self.version == 3.6:
code_node = node[0][0]
else:
@@ -487,7 +487,7 @@ class SourceWalker(GenericASTTraversal, object):
self.template_engine(('\n\n%|def %c\n', -2),
node)
self.prune()
self.n_funcdef = n_funcdef
self.n_function_def = n_function_def
def n_unmapexpr(node):
last_n = node[0][-1]
@@ -987,7 +987,7 @@ class SourceWalker(GenericASTTraversal, object):
self.indent_less()
self.prune()
def n_import_as(self, node):
def n_alias(self, node):
if self.version <= 2.1:
if len(node) == 2:
store = node[1]
@@ -1012,13 +1012,13 @@ class SourceWalker(GenericASTTraversal, object):
self.write(iname, ' as ', sname)
self.prune() # stop recursing
def n_importfrom(self, node):
def n_import_from(self, node):
relative_path_index = 0
if self.version >= 2.5 and node[relative_path_index].pattr > 0:
node[2].pattr = '.'*node[relative_path_index].pattr + node[2].pattr
self.default(node)
n_importstar = n_importfrom
n_import_from_star = n_import_from
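The `'.' * pattr` expression in `n_import_from` above turns the relative-import level stored in the bytecode operand into leading dots; CPython compiles `from ..pkg import x` with a level of 2. A sketch (hypothetical helper name):

```python
def relative_import_prefix(level, module):
    # The IMPORT_NAME operand (pattr here) carries the relative level:
    # 0 for absolute imports, 2 for "from ..pkg import x", etc.
    return '.' * level + module

relative_import_prefix(2, 'pkg')  # '..pkg'
```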
def n_mkfunc(self, node):
@@ -1058,14 +1058,13 @@ class SourceWalker(GenericASTTraversal, object):
self.make_function(node, is_lambda=True, codeNode=node[-2])
self.prune() # stop recursing
def n_list_compr(self, node):
"""List comprehensions the way they are done in Python 2.
"""
def n_list_comp(self, node):
"""List comprehensions"""
p = self.prec
self.prec = 27
if self.version >= 2.7:
if self.is_pypy:
self.n_list_compr_pypy27(node)
self.n_list_comp_pypy27(node)
return
n = node[-1]
elif node[-1] == 'del_stmt':
@@ -1116,9 +1115,8 @@ class SourceWalker(GenericASTTraversal, object):
self.prec = p
self.prune() # stop recursing
def n_list_compr_pypy27(self, node):
"""List comprehensions the way they are done in PYPY Python 2.7.
"""
def n_list_comp_pypy27(self, node):
"""List comprehensions in PYPY."""
p = self.prec
self.prec = 27
if node[-1].kind == 'list_iter':
@@ -1166,7 +1164,7 @@ class SourceWalker(GenericASTTraversal, object):
self.prec = 27
# FIXME: clean this up
if self.version > 3.0 and node == 'dictcomp':
if self.version > 3.0 and node == 'dict_comp':
cn = node[1]
elif self.version < 2.7 and node == 'generator_exp':
if node[0] == 'LOAD_GENEXPR':
@@ -1241,7 +1239,7 @@ class SourceWalker(GenericASTTraversal, object):
self.write(')')
self.prune()
def n_setcomp(self, node):
def n_set_comp(self, node):
self.write('{')
if node[0] in ['LOAD_SETCOMP', 'LOAD_DICTCOMP']:
self.comprehension_walk3(node, 1, 0)
@@ -1344,7 +1342,7 @@ class SourceWalker(GenericASTTraversal, object):
code = Code(node[1].attr, self.scanner, self.currentclass)
ast = self.build_ast(code._tokens, code._customize)
self.customize(code._customize)
if node == 'setcomp':
if node == 'set_comp':
ast = ast[0][0][0]
else:
ast = ast[0][0][0][0][0]
@@ -1390,7 +1388,7 @@ class SourceWalker(GenericASTTraversal, object):
self.write(']')
self.prune()
n_dictcomp = n_setcomp
n_dict_comp = n_set_comp
def setcomprehension_walk3(self, node, collection_index):
"""List comprehensions the way they are done in Python3.
@@ -1447,6 +1445,8 @@ class SourceWalker(GenericASTTraversal, object):
if node == 'classdefdeco2':
if self.version >= 3.6:
class_name = node[1][1].pattr
elif self.version <= 3.3:
class_name = node[2][0].pattr
else:
class_name = node[1][2].pattr
buildclass = node
@@ -1556,7 +1556,7 @@ class SourceWalker(GenericASTTraversal, object):
n_classdefdeco2 = n_classdef
def print_super_classes(self, node):
if not (node == 'build_list'):
if not (node == 'list'):
return
n_subclasses = len(node[:-1])
@@ -1605,10 +1605,10 @@ class SourceWalker(GenericASTTraversal, object):
self.write(')')
def n_mapexpr(self, node):
def n_dict(self, node):
"""
prettyprint a mapexpr
'mapexpr' is something like k = {'a': 1, 'b': 42}"
prettyprint a dict
'dict' is something like k = {'a': 1, 'b': 42}
We will use source-code line breaks to guide us on when to break.
"""
p = self.prec
@@ -1621,7 +1621,7 @@ class SourceWalker(GenericASTTraversal, object):
if self.version >= 3.0 and not self.is_pypy:
if node[0].kind.startswith('kvlist'):
# Python 3.5+ style key/value list in mapexpr
# Python 3.5+ style key/value list in dict
kv_node = node[0]
l = list(kv_node)
i = 0
@@ -1644,7 +1644,7 @@ class SourceWalker(GenericASTTraversal, object):
pass
pass
elif len(node) > 1 and node[1].kind.startswith('kvlist'):
# Python 3.0..3.4 style key/value list in mapexpr
# Python 3.0..3.4 style key/value list in dict
kv_node = node[1]
l = list(kv_node)
if len(l) > 0 and l[0].kind == 'kv3':
@@ -1756,7 +1756,7 @@ class SourceWalker(GenericASTTraversal, object):
self.prec = p
self.prune()
def n_build_list(self, node):
def n_list(self, node):
"""
prettyprint a list or tuple
"""
@@ -1847,7 +1847,7 @@ class SourceWalker(GenericASTTraversal, object):
self.prune()
return
n_build_set = n_build_list
n_build_set = n_list
def n_unpack(self, node):
if node[0].kind.startswith('UNPACK_EX'):
@@ -2074,9 +2074,9 @@ class SourceWalker(GenericASTTraversal, object):
TABLE_R[k] = entry
pass
# handled by n_mapexpr:
# handled by n_dict:
# if op == 'BUILD_SLICE': TABLE_R[k] = ('%C' , (0,-1,':'))
# handled by n_build_list:
# handled by n_list:
# if op == 'BUILD_LIST': TABLE_R[k] = ('[%C]' , (0,-1,', '))
# elif op == 'BUILD_TUPLE': TABLE_R[k] = ('(%C%,)', (0,-1,', '))
pass