Compare commits


36 Commits
3.7.1 ... 3.7.3

Author SHA1 Message Date
rocky
6cb6e45789 Get ready for release 3.7.3 2020-07-25 15:34:37 -04:00
rocky
024a81c053 Fix condition sense in except_handler.py reduction 2020-07-23 11:03:43 -04:00
rocky
a0f93f7ad9 Comment last change 2020-07-22 06:44:34 -04:00
rocky
d3d67441d1 Add VERSION in a pydoc-friendly way 2020-07-22 06:38:47 -04:00
rocky
a215ee2f00 Use "co_consts" in docstring detection.
Note: this is an upheaval because we need to pass "code" or at least
"code.co_consts" to the docstring detection routine
2020-07-21 10:31:07 -04:00
rocky
f62512dd65 Clarify a warning message 2020-07-19 20:36:18 -04:00
rocky
0f80c38530 Better doc string detection
A bug in 2.7 test_descr.py revealed a problem with the way we were
detecting docstrings.

        __doc__ = DocDescr()

was getting confused with a docstring.

This program also reveals other bugs in 3.2+ but we'll deal with that
in another commit.
2020-07-19 20:31:50 -04:00
rocky
bd07de5172 Issue template tweaking again 2020-07-16 07:21:24 -04:00
rocky
e36945e2d9 Another tweak 2020-07-16 07:09:48 -04:00
rocky
25df0bdb76 Spelling typo 2020-07-16 07:04:25 -04:00
rocky
29ceb3fe05 Tweak again 2020-07-16 07:03:01 -04:00
rocky
fd7e04fa5d Small tweaks 2020-07-16 07:00:13 -04:00
rocky
5079164db2 Add reduce check for aug_assign1 2020-07-07 09:54:57 -04:00
rocky
815ae2c5cd for/else detection for older 2.x Pythons 2020-07-06 18:38:14 -04:00
rocky
54932d36fa Small tweaks...
add-test.py: wasn't handling optimize correctly. Handle python version better
parse27.py: dyslexia
01_for_else_try_else.py: bug found in 1.4 anydbm.py which we will
address soon
2020-07-06 18:19:06 -04:00
rocky
fa1d7e4af4 Tweak. 2020-07-06 14:44:25 -04:00
rocky
bfd4b4cd68 Update bug-fixing expectations 2020-07-06 14:35:24 -04:00
rocky
430fd2fa85 Update README.rst status on early Pythons 2020-07-06 12:02:59 -04:00
rocky
ef59b9c304 Forelse reduction checks on 2.6 2020-07-06 10:09:42 -04:00
rocky
084e183577 Add reduce check for 2.7 except_handler range 2020-07-05 22:18:07 -04:00
rocky
7c14cf2d66 Add missing ref URLs 2020-07-05 09:55:43 -04:00
rocky
1d3fdbb4cd Update status 2020-07-05 00:59:16 -04:00
rocky
b21e8b8b57 Get ready for release 3.7.2 2020-06-27 23:08:46 -04:00
rocky
4007b8b702 Back off "or" check using instructions vs opcodes 2020-06-27 11:44:23 -04:00
rocky
598b58796d Back off buggy "or" check 2020-06-27 11:33:46 -04:00
rocky
f7bad891a4 Last commit fixed test_pep352.py 2020-06-27 11:22:53 -04:00
rocky
357f28dd89 Add "comp_if_not" for 2.6- 2020-06-27 11:16:47 -04:00
rocky
5cc572147a Handle more ifelse reduction rules patterns 2020-06-27 09:10:48 -04:00
rocky
11be90758f Workaround bug detecting MAKE_FUNCTION docstrings 2020-06-26 07:17:31 -04:00
rocky
e3720515ae Adjust for newer xdis 2020-06-21 20:20:25 -04:00
rocky
7dec354a47 Merge branch 'master' of github.com:rocky/python-uncompyle6 2020-06-17 10:15:07 -04:00
rocky
2a8daca25d Fix broken __doc__ transform yet again...
Hopefully by using first_child() we have something more robust now.
2020-06-17 10:12:56 -04:00
rocky
7799819cad Add another 3.7 stdlib exclusion test 2020-06-17 05:42:10 -04:00
rocky
c6c50b5dfb Disable compile-farm 3.8.3 checking 2020-06-17 05:29:04 -04:00
rocky
d357898bbf Towards fixing a 3.8 try except-as bug 2020-06-15 06:03:28 -04:00
rocky
c4e7ddf90a Administrivia 2020-06-12 21:29:32 -04:00
48 changed files with 465 additions and 205 deletions

View File

@@ -4,28 +4,40 @@ about: Tell us about uncompyle6 bugs
---
<!-- __Note:__ Have you read https://github.com/rocky/python-uncompyle6/blob/master/HOW-TO-REPORT-A-BUG.md ?
<!-- __Note:__ Bugs are not for asking questions about a problem you
are trying to solve that involve the use of uncompyle6 along the way,
although I may be more tolerant of this if you sponsor the project.
Also, unless you are a sponsor of the project, it may take a
while, maybe a week or so, before the bug report is noticed, let alone
acted upon.
To set expectations, some legitimate bugs can take years
to fix, but they eventually do get fixed. Funding the project was
added to address the problem that there are lots of people seeking
help and reporting bugs, but few people who are willing or capable of
providing help or fixing bugs.
Finally, have you read https://github.com/rocky/python-uncompyle6/blob/master/HOW-TO-REPORT-A-BUG.md
?
Please remove any of the optional sections if they are not applicable.
Prerequisites
Prerequisites/Caveats
* Make sure the bytecode you have can be disassembled with a
disassembler and produces valid results.
* Don't put bytecode and corresponding source code on any service that
requires registration to download.
* When you open a bug report there is no privacy. If the legitimacy of
the activity is deemed suspicious, I may flag it as suspicious,
* When you open a bug report there is no privacy. If you need privacy, then
contact me by email and explain who you are and the need for privacy.
But be mindful that you may be asked to sponsor the project for the
personal and private help that you are requesting.
* If the legitimacy of the activity is deemed suspicious, I may flag it as suspicious,
making the issue even easier to detect.
Bug reports that violate a prerequisite may be discarded.
Note that there are way more bug-fix requestors than there are bug
fixers. If you need more immediate, confidential, or urgent
assistance,
http://www.crazy-compilers.com/decompyle/ offers a byte-code
decompiler service for versions of Python up to 2.6.
Bug reports that violate the above may be discarded.
-->

View File

@@ -30,31 +30,9 @@ files that I know of that will cause problems. See, for example, the
list in
[`test/stdlib/runtests.sh`](https://github.com/rocky/python-uncompyle6/blob/master/test/stdlib/runtests.sh).
But I understand: you would like the bugs _you_ encounter addressed before
all the other known bugs.
There are far more bug reporters than there are bug fixers.
From my standpoint, the good thing about the bugs listed in
`runtests.sh` is that each test case is small and isolated to a single
kind of problem. And I'll tend to fix easier, more isolated cases than
generic "something's wrong" kinds of bugs where I'd have to do a bit
of work to figure out what's up, if not use some sort of mind reading,
make some guesses, and perform some experiments to see if the guesses
are correct. I can't read minds, nor am I into guessing games; I'd
rather devote the effort spent instead towards fixing bugs that are
precisely defined.
And it often turns out that by just fixing the well-defined and
prescribed cases, the ill-defined amorphous cases will get
handled as well.
In sum, you may need to do some work to have the bug you have found
handled before the hundreds of other bugs, and other things I could be
doing.
No one is getting paid to work on this project, let alone the
bugs you may have an interest in. If you require decompiling bytecode
immediately, consider using a decompilation service, listed further
down in this document.
Unless you are a sponsor of this project, it may take a while, maybe a week or so, before the bug report is noticed, let alone acted upon. Things eventually get fixed, but it may take years.
# Is it really a bug?
@@ -176,11 +154,7 @@ provide the input command and the output from that, please give:
## But I don't *have* the source code!
Sure, I get it. No problem. There is Python assembly code on parse
errors, so simply by hand decompile that. To get a full disassembly,
use `pydisasm` from the [xdis](https://pypi.python.org/pypi/xdis)
package. Opcodes are described in the documentation for
the [dis](https://docs.python.org/3.6/library/dis.html) module.
There is Python assembly code on parse errors, so simply decompile that by hand. To get a full disassembly, use `pydisasm` from the [xdis](https://pypi.python.org/pypi/xdis) package. Opcodes are described in the documentation for the [dis](https://docs.python.org/3.6/library/dis.html) module.
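To make the `pydisasm` step concrete, here is a minimal sketch of driving it from Python; the bytecode filename is a placeholder, and it assumes `pydisasm` is on your PATH (it is installed along with the xdis package).

# Minimal sketch: produce a full disassembly of a compiled file using
# pydisasm from the xdis package.  "mymodule.pyc" is a placeholder name.
import subprocess

subprocess.run(["pydisasm", "mymodule.pyc"], check=True)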
### But I don't *have* the source code and am incapable of figuring how to do a hand disassembly!
@@ -189,7 +163,7 @@ disassemble Python bytecode. And as Richard Feynman once said, "What
one fool can learn, so can another."
If this is too difficult, or too time consuming, or not of interest to
you, then perhaps what you require is a decompilation service. [Crazy
you, then you might consider sponsoring the project. [Crazy
Compilers](http://www.crazy-compilers.com/decompyle/) offers a
byte-code decompiler service for versions of Python up to 2.6. (If
there are others around let me know and I'll list them here.)
@@ -217,10 +191,7 @@ likely the problem will be fixed and fixed sooner.
# Karma
I realize that following the instructions given herein puts a bit of
burden on the bug reporter. In my opinion, this is justified as
attempts to balance somewhat the burden and effort needed to fix the
bug and the attempts to balance number of would-be bug reporters with
the number of bug fixers. Better bug reporters are more likely to move
burden on the bug reporter. This is justified as an attempt to balance somewhat the burden and effort needed to fix the bug, and to balance the number of would-be bug reporters with the number of bug fixers. Better bug reporters are more likely to move
into the category of bug fixers.
The barrier to reporting a big is pretty small: all you really need is
@@ -228,22 +199,14 @@ a github account, and the ability to type something after clicking
some buttons. So the reality is that many people just don't bother to
read these instructions, let alone follow them to any degree.
And the reality is also that bugs sometimes get fixed even though
these instructions are not followed.
That said, bugs sometimes get fixed even though these instructions are not followed.
So one factor I may take into consideration is the bug reporter's karma.
One factor I may take into consideration is the bug reporter's karma.
* Have you demonstrably contributed to open source? I may look at your github profile to see what contributions you have made, how popular those contributions are, or how popular you are.
* How appreciative are you? Have you starred this project that you are seeking help from? Have you starred _any_ github project? And the above two kind of feed into ...
* Attitude. Some people feel that they are doing me and the world a great favor by just pointing out that there is a problem whose solution would greatly benefit them. (This might partially account for why those who have this attitude often don't read or follow instructions such as those given here.)
* Have you demonstrably contributed to open source? I may look at your
github profile to see what contributions you have made, how popular
those contributions are, or how popular you are.
* How appreciative are you? Have you starred this project that you are
seeking help from? Have you starred _any_ github project? And the above
two kind of feed into ...
* Attitude. Some people feel that they are doing me and the world a
great favor by just pointing out that there is a problem whose solution
would greatly benefit them. Perhaps this is why they feel that
instructions are not to be followed by them, nor is there any need for
showing evidence of gratitude when help is offered to them.
# Confidentiality of Bug Reports
@@ -255,6 +218,8 @@ remains would not be an issue.
However feel free to remove any comments, and modify variable names
or constants in the source code.
If there is some legitimate reason to keep things confidential, you can contact me by email to explain the extenuating circumstances. However, I tend to discard anonymous email without reading it.
# Ethics
I do not condone using this program for unethical or illegal purposes.

NEWS.md
View File

@@ -1,3 +1,24 @@
3.7.3: 2020-7-25
================
Mostly small miscellaneous bug fixes
* `__doc__ = DocDescr()` from `test_descr.py` was getting confused as a docstring.
* detect 2.7 except handler range better
* Add for .. else reduction checks on 2.6 and before
* Add reduce check for 2.7 augmented assign
* Add VERSION in a pydoc-friendly way
3.7.2: 2020-6-27
================
* Use newer xdis
* Docstrings (again) which were broken again on earlier Python
* Fix 2.6 and 2.7 decompilation bug in handling "list if" comprehensions
3.7.1: 2020-6-12 Fleetwood66
====================================================

View File

@@ -68,10 +68,9 @@ are syntactically correct by running the Python interpreter for that
bytecode version. Finally, in cases where the program has a test for
itself, we can run the check on the decompiled code.
We are serious about testing, and use automated processes to find
bugs. In the issue trackers for other decompilers, you will find a
number of bugs we've found along the way. Very few to none of them are
fixed in the other decompilers.
We use automated processes to find bugs. In the issue trackers for
other decompilers, you will find a number of bugs we've found along
the way. Very few to none of them are fixed in the other decompilers.
Requirements
------------
@@ -172,15 +171,11 @@ All of the Python decompilers that I have looked at have problems
decompiling Python's control flow. In some cases we can detect an
erroneous decompilation and report that.
Python support is strongest in Python 2 for 2.7 and drops off as you
get further away from that. Support is also probably pretty good for
python 2.3-2.4 since a lot of the goodness of the early version of the
decompiler from that era has been preserved (and Python compilation in
that era was minimal)
Python support is pretty good for Python 2.
There is some work to do on the lower end Python versions which is
more difficult for us to handle since we don't have a Python
interpreter for versions 1.6, and 2.0.
On the lower end of Python versions, decompilation seems pretty good although
we don't have any automated testing in place for Python's distributed tests.
Also, we don't have a Python interpreter for versions 1.6, and 2.0.
In the Python 3 series, Python support is strongest around 3.4 or
3.3 and drops off as you move further away from those versions. Python
@@ -195,7 +190,7 @@ added. So in sum handling control flow by ad hoc means as is currently
done is worse.
Between Python 3.5, 3.6, 3.7 there have been major changes to the
:code:`MAKE_FUNCTION` and :code:`CALL_FUNCTION` instructions. Python
:code:`MAKE_FUNCTION` and :code:`CALL_FUNCTION` instructions.
Python 3.8 removes :code:`SETUP_LOOP`, :code:`SETUP_EXCEPT`,
:code:`BREAK_LOOP`, and :code:`CONTINUE_LOOP`, instructions which may
@@ -215,17 +210,43 @@ which use their own magic and encrypt bytecode. With the exception of
the Dropbox's old Python 2.5 interpreter this kind of thing is not
handled.
We also don't handle PJOrion_ obfuscated code. For that try: PJOrion
Deobfuscator_ to unscramble the bytecode to get valid bytecode before
trying this tool. This program can't decompile Microsoft Windows EXE
files created by Py2EXE_, although we can probably decompile the code
after you extract the bytecode properly. For situations like this, you
might want to consider a decompilation service like `Crazy Compilers
<http://www.crazy-compilers.com/decompyle/>`_. Handling
pathologically long lists of expressions or statements is slow.
We also don't handle PJOrion_ or otherwise obfuscated code. For
PJOrion try: PJOrion Deobfuscator_ to unscramble the bytecode to get
valid bytecode before trying this tool. This program can't decompile
Microsoft Windows EXE files created by Py2EXE_, although we can
probably decompile the code after you extract the bytecode
properly. Handling pathologically long lists of expressions or
statements is slow. We don't handle Cython_ or MicroPython which don't
use bytecode.
There are numerous bugs in decompilation. And that's true for every
other CPython decompiler I have encountered, even the ones that
claimed to be "perfect" on some particular version like 2.4.
There is lots to do, so please dig in and help.
As Python progresses decompilation also gets harder because the
compilation is more sophisticated and the language itself is more
sophisticated. I suspect there will be fewer ad-hoc attempts like
unpyc37_ (which is based on a 3.3 decompiler) simply
because it is harder to do so. The good news, at least from my
standpoint, is that I think I understand what's needed to address the
problems in a more robust way. But right now until such time as
project is better funded, I do not intend to make any serious effort
to support Python versions 3.8 or 3.9, including bugs that might come
in. I imagine at some point I may be interested in it.
You can easily find bugs by running the tests against the standard
test suite that Python uses to check itself. At any given time, there are
dozens of known problems that are pretty well isolated and that could
be solved if one were to put in the time to do so. The problem is that
there aren't that many people who have been working on bug fixing.
Some of the bugs in 3.7 and 3.8 are simply a matter of back-porting
the fixes in decompyle3.
You may run across a bug that you want to report. Please do so. But
be aware that it might not get my attention for a while. If you
sponsor or support the project in some way, I'll prioritize your
issues above the queue of other things I might be doing instead.
See Also
--------
@@ -242,6 +263,7 @@ See Also
* https://github.com/zrax/pycdc : The README for this C++ code says it aims to support all versions of Python. It is best for Python versions around 2.7 and 3.3 when the code was initially developed. Accuracy for current versions of Python3 and early versions of Python is lacking. Without major effort, it is unlikely it can be made to support current Python 3. See its `issue tracker <https://github.com/zrax/pycdc/issues>`_ for details. Currently lightly maintained.
.. _Cython: https://en.wikipedia.org/wiki/Cython
.. _trepan: https://pypi.python.org/pypi/trepan2g
.. _compiler: https://pypi.python.org/pypi/spark_parser
.. _HISTORY: https://github.com/rocky/python-uncompyle6/blob/master/HISTORY.md

View File

@@ -69,7 +69,7 @@ entry_points = {
]}
ftp_url = None
install_requires = ["spark-parser >= 1.8.9, < 1.9.0",
"xdis >= 4.6.1, < 4.8.0"]
"xdis >= 4.7.0, <5.1.0"]
license = "GPL3"
mailing_list = "python-debugger@googlegroups.com"

View File

@@ -2,17 +2,17 @@
**Table of Contents**
- [Get latest sources:](#get-latest-sources)
- [Change version in uncompyle6/version.py](#change-version-in-uncompyle6versionpy)
- [Change version in uncompyle6/version.py:](#change-version-in-uncompyle6versionpy)
- [Update ChangeLog:](#update-changelog)
- [Update NEWS from ChangeLog:](#update-news-from-changelog)
- [Update NEWS.md from ChangeLog:](#update-newsmd-from-changelog)
- [Make sure pyenv is running and check newer versions](#make-sure-pyenv-is-running-and-check-newer-versions)
- [Switch to python-2.4, sync that up and build that first since it creates a tarball which we don't want.](#switch-to-python-24-sync-that-up-and-build-that-first-since-it-creates-a-tarball-which-we-dont-want)
- [Update NEWS from master branch](#update-news-from-master-branch)
- [Check against all versions](#check-against-all-versions)
- [Check against older versions](#check-against-older-versions)
- [Make packages and tag](#make-packages-and-tag)
- [Upload single package and look at Rst Formatting](#upload-single-package-and-look-at-rst-formatting)
- [Upload rest of versions](#upload-rest-of-versions)
- [Push tags:](#push-tags)
- [Check package on github](#check-package-on-github)
- [Release on Github](#release-on-github)
- [Get onto PyPI](#get-onto-pypi)
- [Update tags:](#update-tags)
<!-- markdown-toc end -->
# Get latest sources:
@@ -39,7 +39,7 @@
# Make sure pyenv is running and check newer versions
$ pyenv local && source admin-tools/check-newer-versions.sh
$ admin-tools/check-newer-versions.sh
# Switch to python-2.4, sync that up and build that first since it creates a tarball which we don't want.
@@ -50,37 +50,46 @@
# Check against older versions
$ source admin-tools/check-older-versions.sh
$ admin-tools/check-older-versions.sh
# Make packages and tag
$ . ./admin-tools/make-dist-older.sh
$ pyenv local 3.8.4
$ twine check dist/uncompyle6-$VERSION*
$ ./admin-tools/make-dist-newer.sh
$ twine check dist/uncompyle6-$VERSION*
# Check package on github
$ [[ ! -d /tmp/gittest ]] && mkdir /tmp/gittest; pushd /tmp/gittest
$ pyenv local 3.8.3
$ twine check dist/uncompyle6-$VERSION*
$ . ./admin-tools/make-dist-newer.sh
$ twine check dist/uncompyle6-$VERSION*
$ pip install -e git://github.com/rocky/python-uncompyle6.git#egg=uncompyle6
$ uncompyle6 --help
$ pip uninstall uncompyle6
$ popd
# Upload single package and look at Rst Formatting
$ twine check dist/uncompyle6-${VERSION}*
$ twine upload dist/uncompyle6-${VERSION}-py3.3.egg
# Upload rest of versions
$ twine upload dist/uncompyle6-${VERSION}*
# Release on Github
Goto https://github.com/rocky/python-uncompyle6/releases
# Push tags:
Now check the *tagged* release. (Checking the untagged release was previously done).
Todo: turn this into a script in `admin-tools`
$ pushd /tmp/gittest
$ pip install -e git://github.com/rocky/python-uncompyle6.git@$VERSION#egg=uncompyle6
$ uncompyle6 --help
$ pip uninstall uncompyle6
$ popd
# Get onto PyPI
$ twine upload dist/uncompyle6-${VERSION}*
# Update tags:
$ git push --tags
# Check on a VM
$ cd /virtual/vagrant/virtual/vagrant/ubuntu-zesty
$ vagrant up
$ vagrant ssh
$ pyenv local 3.5.2
$ pip install --upgrade uncompyle6
$ exit
$ vagrant halt
$ git pull --tags

View File

@@ -5,4 +5,4 @@ if [[ $0 == ${BASH_SOURCE[0]} ]] ; then
echo "This script should be *sourced* rather than run directly through bash"
exit 1
fi
export PYVERSIONS='3.5.9 3.6.10 2.6.9 3.3.7 2.7.18 3.2.6 3.1.5 3.4.10 3.7.7 3.8.3'
export PYVERSIONS='3.5.9 3.6.10 2.6.9 3.3.7 2.7.18 3.2.6 3.1.5 3.4.10 3.7.7 3.8.4'

View File

@@ -3,14 +3,22 @@
"""
import os, sys, py_compile
assert len(sys.argv) >= 2
assert (2 <= len(sys.argv) <= 4)
version = sys.version[0:3]
vers = sys.version_info[:2]
if sys.argv[1] in ("--run", "-r"):
suffix = "_run"
py_source = sys.argv[2:]
i = 2
else:
suffix = ""
py_source = sys.argv[1:]
i = 1
try:
optimize = int(sys.argv[-1])
py_source = sys.argv[i:-1]
except:
optimize = 2
for path in py_source:
short = os.path.basename(path)
@@ -19,7 +27,10 @@ for path in py_source:
else:
cfile = "bytecode_%s%s/%s" % (version, suffix, short) + "c"
print("byte-compiling %s to %s" % (path, cfile))
optimize = 2
py_compile.compile(path, cfile, optimize=optimize)
if isinstance(version, str) or version >= (2, 6, 0):
optimize = optimize
if vers > (3, 1):
py_compile.compile(path, cfile, optimize=optimize)
else:
py_compile.compile(path, cfile)
if vers >= (2, 6):
os.system("../bin/uncompyle6 -a -T %s" % cfile)
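As a usage illustration of the updated argument handling above, here is a hedged sketch; the invocation and working directory are assumptions rather than part of the diff, and the file name is borrowed from the test added elsewhere in this change set.

# Hedged usage sketch: byte-compile and run one test file with an explicit
# optimize level of 1.  This matches the argv parsing shown above:
# argv[1] == "--run", and a trailing integer is taken as the optimize level.
import subprocess

subprocess.run(
    ["python", "add-test.py", "--run", "01_for_else_try_else.py", "1"],
    check=True,
)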

Binary file not shown.

Binary file not shown.

Binary file not shown.

Binary file not shown.

Binary file not shown.

Binary file not shown.

Binary file not shown.

Binary file not shown.

View File

@@ -51,7 +51,7 @@ for VERSION in $PYVERSIONS ; do
LOGFILE=/tmp/${MAIN}-$VERSION-$$.log
case "$VERSION" in
3.7.7 | 3.8.2 | 3.1.5 | 3.0.1 )
3.7.7 | 3.8.3 | 3.1.5 | 3.0.1 )
continue
;;
3.5.9 )

View File

@@ -0,0 +1,17 @@
# Adapted from 1.4 anydbm
"""This program is self-checking!"""
def scan(items):
for i in items:
try:
5 / i
except:
continue
else:
break
else:
return 2
return i
assert scan((0, 1)) == 1
assert scan((0, 0)) == 2
assert scan((3, 2, 1)) == 3

View File

@@ -0,0 +1,9 @@
# From 2.7.17 test_bdb.py
# The problem was detecting a docstring at the beginning of the module
# It must be detected and change'd or else the "from __future__" below
# is invalid.
# Note that this has to be compiled with optimization < 2 or else optimization
# will remove the docstring
"""Rational, infinite-precision, real numbers."""
from __future__ import division

View File

@@ -0,0 +1,27 @@
# From 2.7 test_descr.py
# Testing __doc__ descriptor...
# The bug in decompilation was erroneously matching
# __doc__ = as a docstring
"""This program is self-checking!"""
def test_doc_descriptor():
# Testing __doc__ descriptor...
# Python SF bug 542984
class DocDescr(object):
def __get__(self, object, otype):
if object:
object = object.__class__.__name__ + ' instance'
if otype:
otype = otype.__name__
return 'object=%s; type=%s' % (object, otype)
class OldClass:
__doc__ = DocDescr()
class NewClass(object):
__doc__ = DocDescr()
assert OldClass.__doc__ == 'object=None; type=OldClass'
assert OldClass().__doc__ == 'object=OldClass instance; type=OldClass'
assert NewClass.__doc__ == 'object=None; type=NewClass'
assert NewClass().__doc__ == 'object=NewClass instance; type=NewClass'
test_doc_descriptor()

View File

@@ -45,7 +45,6 @@ SKIP_TESTS=(
[test_grammar.py]=1 # Too many stmts. Handle large stmts
[test_grp.py]=1 # Long test - might work Control flow?
[test_pep247.py]=1 # Long test - might work? Control flow?
[test_pwd.py]=1 # Long test - might work? Control flow?
[test_socketserver.py]=1 # -- test takes too long to run: 40 seconds
[test_threading.py]=1 # test takes too long to run: 11 seconds
[test_thread.py]=1 # test takes too long to run: 36 seconds

View File

@@ -55,12 +55,9 @@ SKIP_TESTS=(
[test_ossaudiodev.py]=1 # it fails on its own
[test_pdb.py]=1 # Line-number specific
[test_pep277.py]=1 # it fails on its own
[test_pep352.py]=1 # Investigate
[test_plistlib.py]=1 # it fails on its own
[test_pwd.py]=1 # Long test - might work? Control flow?
[test_pyclbr.py]=1 # Investigate
[test_rgbimg.py]=1 # it fails on its own
[test_queue.py]=1 # Control flow?
[test_scriptpackages.py]=1 # it fails on its own
[test_socketserver.py]=1 # Too long to run - 42 seconds
[test_sqlite.py]=1 # it fails on its own

View File

@@ -7,7 +7,6 @@ SKIP_TESTS=(
# assert 0 # shouldn't reach here.
[test_shutil.py]=1
[test___all__.py]=1 # it fails on its own
[test___all__.py]=1 # it fails on its own
[test_aepack.py]=1 # Fails on its own
@@ -60,9 +59,7 @@ SKIP_TESTS=(
[test_ossaudiodev.py]=1 # it fails on its own
[test_pep277.py]=1 # it fails on its own
[test_pep352.py]=1 # Investigate
[test_pyclbr.py]=1 # Investigate
[test_pwd.py]=1 # Long test - might work? Control flow?
[test_py3kwarn.py]=1 # it fails on its own
[test_scriptpackages.py]=1 # it fails on its own

View File

@@ -19,7 +19,6 @@ SKIP_TESTS=(
[test_modulefinder.py]=1 # FIX
[test_multiprocessing.py]=1 # On uncompyle2, takes 24 secs
[test_poll.py]=1 # test takes too long to run: 11 seconds
[test_pwd.py]=1 # Takes too long
[test_regrtest.py]=1 #
[test_runpy.py]=1 # Long and fails on its own
[test_select.py]=1 # Runs okay but takes 11 seconds

View File

@@ -63,6 +63,7 @@ SKIP_TESTS=(
[test_faulthandler.py]=1 # test takes too long before decompiling
[test_fileinput.py]=1 # Test assertion failures
[test_finalization.py]=1 # if/else logic
[test_frame.py]=1 # test assertion errors
[test_ftplib.py]=1 # parse error
[test_fstring.py]=1 # need to disambiguate leading fstrings from docstrings

View File

@@ -28,20 +28,25 @@
import sys
__docformat__ = 'restructuredtext'
__docformat__ = "restructuredtext"
PYTHON3 = (sys.version_info >= (3, 0))
from uncompyle6.version import VERSION
# This ensures VERSION will appear in pydoc
__version__ = VERSION
PYTHON3 = sys.version_info >= (3, 0)
# We do this crazy way to support Python 2.6 which
# doesn't support version_major, and has a bug in
# floating point so we can't divide 26 by 10 and get
# 2.6
PYTHON_VERSION = sys.version_info[0] + (sys.version_info[1] / 10.0)
PYTHON_VERSION_STR = "%s.%s" % (sys.version_info[0], sys.version_info[1])
PYTHON_VERSION_STR = "%s.%s" % (sys.version_info[0], sys.version_info[1])
IS_PYPY = '__pypy__' in sys.builtin_module_names
IS_PYPY = "__pypy__" in sys.builtin_module_names
if hasattr(sys, 'setrecursionlimit'):
if hasattr(sys, "setrecursionlimit"):
# pyston doesn't have setrecursionlimit
sys.setrecursionlimit(5000)
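As a worked illustration of the version arithmetic above (a sketch only; it restates what the committed lines compute and how the rest of the code compares against the result):

# Only sys.version_info indexing is relied on, which exists on every
# interpreter uncompyle6 supports; 2.6 has no sys.version_info.major.
import sys

PYTHON_VERSION = sys.version_info[0] + (sys.version_info[1] / 10.0)
PYTHON_VERSION_STR = "%s.%s" % (sys.version_info[0], sys.version_info[1])

# The float form allows simple ordered comparisons against literals,
# e.g. the PYTHON_VERSION < 3.0 check used later in this change set.
if PYTHON_VERSION < 3.0:
    print("running on Python 2:", PYTHON_VERSION_STR)
else:
    print("running on Python 3 or later:", PYTHON_VERSION_STR)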

View File

@@ -101,7 +101,7 @@ def decompile(
)
if PYTHON_VERSION < 3.0 and bytecode_version >= 3.0:
write(
'# Warning: this version has problems handling the Python 3 "byte" type in constants properly.\n'
'# Warning: this version of Python has problems handling the Python 3 "byte" type in constants properly.\n'
)
if co.co_filename:
@@ -119,12 +119,10 @@ def decompile(
mapstream = _get_outstream(mapstream)
deparsed = deparse_code_with_map(
bytecode_version,
co,
out,
showasm,
showast,
showgrammar,
bytecode_version,
debug_opts,
code_objects=code_objects,
is_pypy=is_pypy,
)

View File

@@ -623,7 +623,7 @@ class PythonParser(GenericASTBuilder):
"""
def parse(p, tokens, customize):
def parse(p, tokens, customize, code):
p.customize_grammar_rules(tokens, customize)
ast = p.parse(tokens)
# p.cleanup()
@@ -878,7 +878,7 @@ def python_parser(
# parser_debug = {'rules': True, 'transition': True, 'reduce' : True,
# 'showstack': 'full'}
p = get_python_parser(version, parser_debug)
return parse(p, tokens, customize)
return parse(p, tokens, customize, co)
if __name__ == "__main__":

View File

@@ -1,4 +1,4 @@
# Copyright (c) 2017-2019 Rocky Bernstein
# Copyright (c) 2017-2020 Rocky Bernstein
"""
spark grammar differences over Python2 for Python 2.6.
"""
@@ -6,6 +6,7 @@ spark grammar differences over Python2 for Python 2.6.
from uncompyle6.parser import PythonParserSingle
from spark_parser import DEFAULT_DEBUG as PARSER_DEFAULT_DEBUG
from uncompyle6.parsers.parse2 import Python2Parser
from uncompyle6.parsers.reducecheck import (except_handler, tryelsestmt)
class Python26Parser(Python2Parser):
@@ -24,7 +25,11 @@ class Python26Parser(Python2Parser):
except_handler ::= JUMP_FORWARD COME_FROM except_stmts
come_froms_pop END_FINALLY come_froms
except_handler ::= JUMP_FORWARD COME_FROM except_stmts END_FINALLY
except_handler ::= JUMP_FORWARD COME_FROM except_stmts
END_FINALLY
except_handler ::= JUMP_FORWARD COME_FROM except_stmts
POP_TOP END_FINALLY
come_froms
except_handler ::= jmp_abs COME_FROM except_stmts
@@ -33,6 +38,7 @@ class Python26Parser(Python2Parser):
except_handler ::= jmp_abs COME_FROM except_stmts
END_FINALLY JUMP_FORWARD
# Sometimes we don't put in COME_FROM to the next statement
# like we do in 2.7. Perhaps we should?
try_except ::= SETUP_EXCEPT suite_stmts_opt POP_BLOCK
@@ -87,8 +93,8 @@ class Python26Parser(Python2Parser):
cf_jb_cf_pop ::= _come_froms JUMP_BACK come_froms POP_TOP
bp_come_from ::= POP_BLOCK COME_FROM
jb_pb_come_from ::= JUMP_BACK bp_come_from
pb_come_from ::= POP_BLOCK COME_FROM
jb_pb_come_from ::= JUMP_BACK pb_come_from
_ifstmts_jump ::= c_stmts_opt JUMP_FORWARD COME_FROM POP_TOP
_ifstmts_jump ::= c_stmts_opt JUMP_FORWARD come_froms POP_TOP COME_FROM
@@ -142,7 +148,7 @@ class Python26Parser(Python2Parser):
while1stmt ::= SETUP_LOOP l_stmts_opt CONTINUE _come_froms
whilestmt ::= SETUP_LOOP testexpr l_stmts_opt jb_pop POP_BLOCK _come_froms
whilestmt ::= SETUP_LOOP testexpr l_stmts_opt jb_cf_pop bp_come_from
whilestmt ::= SETUP_LOOP testexpr l_stmts_opt jb_cf_pop pb_come_from
whilestmt ::= SETUP_LOOP testexpr l_stmts_opt jb_cf_pop POP_BLOCK
whilestmt ::= SETUP_LOOP testexpr returns POP_BLOCK COME_FROM
@@ -232,7 +238,10 @@ class Python26Parser(Python2Parser):
comp_for ::= SETUP_LOOP expr for_iter store comp_iter jb_pb_come_from
comp_body ::= gen_comp_body
comp_iter ::= comp_if_not
comp_if_not ::= expr jmp_true comp_iter
comp_body ::= gen_comp_body
for_block ::= l_stmts_opt _come_froms POP_TOP JUMP_BACK
@@ -342,18 +351,30 @@ class Python26Parser(Python2Parser):
WITH_CLEANUP END_FINALLY
""")
super(Python26Parser, self).customize_grammar_rules(tokens, customize)
self.reduce_check_table = {
"except_handler": except_handler,
"tryelsestmt": tryelsestmt,
"tryelsestmtl": tryelsestmt,
}
self.check_reduce['and'] = 'AST'
self.check_reduce['assert_expr_and'] = 'AST'
self.check_reduce["except_handler"] = "tokens"
self.check_reduce["ifstmt"] = "tokens"
self.check_reduce["ifelsestmt"] = "AST"
self.check_reduce["forelselaststmtl"] = "tokens"
self.check_reduce["forelsestmt"] = "tokens"
self.check_reduce['list_for'] = 'AST'
self.check_reduce['try_except'] = 'tokens'
self.check_reduce['tryelsestmt'] = 'AST'
self.check_reduce['tryelsestmtl'] = 'AST'
def reduce_is_invalid(self, rule, ast, tokens, first, last):
invalid = super(Python26Parser,
self).reduce_is_invalid(rule, ast,
tokens, first, last)
lhs = rule[0]
if invalid or tokens is None:
return invalid
if rule in (
@@ -386,6 +407,27 @@ class Python26Parser(Python2Parser):
return not (jmp_target == tokens[test_index].offset or
tokens[last].pattr == jmp_false.pattr)
elif lhs in ("forelselaststmtl", "forelsestmt"):
# print("XXX", first, last)
# for t in range(first, last):
# print(tokens[t])
# print("=" * 30)
# If the SETUP_LOOP jumps to the tokens[last] then
# this is a "for" not a "for else".
# However, in Python 2.2 and before there is a SET_LINENO
# instruction which might have gotten removed. So we need
# to account for that. bytecode-1.4/anydbm.pyc exhibits
# this behavior.
# Also we need to use the setup_loop instruction (not opcode)
# since the operand can be a relative offset rather than
# an absolute offset.
setup_inst = self.insts[self.offset2inst_index[tokens[first].offset]]
if self.version <= 2.2 and tokens[last] == "COME_FROM":
last += 1
return tokens[last-1].off2int() > setup_inst.argval
elif rule == ("ifstmt", ("testexpr", "_ifstmts_jump")):
for i in range(last-1, last-4, -1):
t = tokens[i]
@@ -402,7 +444,7 @@ class Python26Parser(Python2Parser):
# The JUMP_ABSOLUTE has to be to the last POP_TOP or this is invalid
ja_attr = ast[4].attr
return tokens[last].offset != ja_attr
elif rule[0] == 'try_except':
elif lhs == 'try_except':
# We need to distinguish try_except from tryelsestmt and we do that
# by checking the jump before the END_FINALLY
# If we have:
@@ -419,12 +461,12 @@ class Python26Parser(Python2Parser):
last -= 1
if (tokens[last] == 'COME_FROM'
and tokens[last-1] == 'END_FINALLY'
and tokens[last-2] == 'POP_TOP'):
and tokens[last-2] == 'POP_TOP'):
# A jump of 2 is a jump around POP_TOP, END_FINALLY which
# would indicate try/else rather than try
return (tokens[last-3].kind not in frozenset(('JUMP_FORWARD', 'RETURN_VALUE'))
or (tokens[last-3] == 'JUMP_FORWARD' and tokens[last-3].attr != 2))
elif rule[0] == 'tryelsestmt':
elif lhs == 'tryelsestmt':
# We need to distinguish try_except from tryelsestmt and we do that
# by making sure that the jump before the except handler jumps to

View File

@@ -1,4 +1,4 @@
# Copyright (c) 2016-2019 Rocky Bernstein
# Copyright (c) 2016-2020 Rocky Bernstein
# Copyright (c) 2005 by Dan Pascu <dan@windowmaker.org>
# Copyright (c) 2000-2002 by hartmut Goebel <hartmut@goebel.noris.de>
@@ -7,9 +7,10 @@ from xdis import next_offset
from uncompyle6.parser import PythonParserSingle, nop_func
from uncompyle6.parsers.parse2 import Python2Parser
from uncompyle6.parsers.reducecheck import (
aug_assign1_check,
or_check,
ifelsestmt,
tryelsestmt,
except_handler,
)
class Python27Parser(Python2Parser):
@@ -92,7 +93,7 @@ class Python27Parser(Python2Parser):
iflaststmtl ::= testexpr c_stmts
_ifstmts_jump ::= c_stmts_opt JUMP_FORWARD come_froms
bp_come_from ::= POP_BLOCK COME_FROM
pb_come_from ::= POP_BLOCK COME_FROM
# FIXME: Common with 3.0+
jmp_false ::= POP_JUMP_IF_FALSE
@@ -164,7 +165,7 @@ class Python27Parser(Python2Parser):
while1elsestmt ::= SETUP_LOOP l_stmts JUMP_BACK
else_suitel COME_FROM
while1stmt ::= SETUP_LOOP returns bp_come_from
while1stmt ::= SETUP_LOOP returns pb_come_from
while1stmt ::= SETUP_LOOP l_stmts_opt JUMP_BACK POP_BLOCK COME_FROM
whilestmt ::= SETUP_LOOP testexpr l_stmts_opt JUMP_BACK POP_BLOCK _come_froms
@@ -231,12 +232,15 @@ class Python27Parser(Python2Parser):
# FIXME: Put more in this table
self.reduce_check_table = {
# "ifelsestmt": ifelsestmt,
"aug_assign1": aug_assign1_check,
"except_handler": except_handler,
"or": or_check,
"tryelsestmt": tryelsestmt,
"tryelsestmtl": tryelsestmt,
}
self.check_reduce["and"] = "AST"
self.check_reduce["aug_assign1"] = "AST"
self.check_reduce["if_exp"] = "AST"
self.check_reduce["except_handler"] = "tokens"

View File

@@ -45,6 +45,7 @@ class Python38Parser(Python37Parser):
stmt ::= try_elsestmtl38
stmt ::= try_except_ret38
stmt ::= try_except38
stmt ::= try_except_as
stmt ::= whilestmt38
stmt ::= whileTruestmt38
stmt ::= call_stmt
@@ -133,6 +134,8 @@ class Python38Parser(Python37Parser):
except_cond1 ::= DUP_TOP expr COMPARE_OP jmp_false
POP_TOP POP_TOP POP_TOP
POP_EXCEPT
except_cond_as ::= DUP_TOP expr COMPARE_OP POP_JUMP_IF_FALSE
POP_TOP STORE_FAST POP_TOP
try_elsestmtl38 ::= SETUP_FINALLY suite_stmts_opt POP_BLOCK
except_handler38 COME_FROM
@@ -145,6 +148,10 @@ class Python38Parser(Python37Parser):
# suite_stmts has a return
try_except38 ::= SETUP_FINALLY POP_BLOCK suite_stmts
except_handler38b
try_except_as ::= SETUP_FINALLY POP_BLOCK suite_stmts
except_handler_as END_FINALLY COME_FROM
try_except_as ::= SETUP_FINALLY suite_stmts
except_handler_as END_FINALLY COME_FROM
try_except_ret38 ::= SETUP_FINALLY returns except_ret38a
try_except_ret38a ::= SETUP_FINALLY returns except_handler38c
@@ -165,6 +172,11 @@ class Python38Parser(Python37Parser):
except_handler38a ::= COME_FROM_FINALLY POP_TOP POP_TOP POP_TOP
POP_EXCEPT POP_TOP stmts END_FINALLY
except_handler38c ::= COME_FROM_FINALLY except_cond1a except_stmts
POP_EXCEPT JUMP_FORWARD COME_FROM
except_handler_as ::= COME_FROM_FINALLY except_cond_as tryfinallystmt
POP_EXCEPT JUMP_FORWARD COME_FROM
tryfinallystmt ::= SETUP_FINALLY suite_stmts_opt POP_BLOCK
BEGIN_FINALLY COME_FROM_FINALLY suite_stmts_opt
END_FINALLY
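For orientation, here is a hedged sketch (the exception type and names are invented, not taken from the diff) of the Python 3.8 source shape that the new `try_except_as` and `except_cond_as` rules are meant to recognize:

# Illustrative source only: an "except ... as ..." handler, which in 3.8
# bytecode shows up as the SETUP_FINALLY ... COMPARE_OP ... STORE_FAST
# pattern matched by except_cond_as above.
try:
    value = int("not a number")
except ValueError as exc:
    print("conversion failed:", exc)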

View File

@@ -1,4 +1,6 @@
from uncompyle6.parsers.reducecheck.and_check import *
from uncompyle6.parsers.reducecheck.aug_assign import *
from uncompyle6.parsers.reducecheck.except_handler import *
from uncompyle6.parsers.reducecheck.except_handler_else import *
from uncompyle6.parsers.reducecheck.ifelsestmt import *
from uncompyle6.parsers.reducecheck.iflaststmt import *

View File

@@ -0,0 +1,10 @@
# Copyright (c) 2020 Rocky Bernstein
def aug_assign1_check(self, lhs, n, rule, ast, tokens, first, last):
# print("XXX", first, last, rule)
# for t in range(first, last): print(tokens[t])
# print("="*40)
expr = ast[0]
return expr == "expr" and expr[0] == "or"
return False

View File

@@ -0,0 +1,19 @@
# Copyright (c) 2020 Rocky Bernstein
def except_handler(self, lhs, n, rule, ast, tokens, first, last):
end_token = tokens[last-1]
# print("XXX", first, last)
# for t in range(first, last):
# print(tokens[t])
# print("=" * 30)
# FIXME: Figure out why this doesn't work on
# bytecode-1.4/anydbm.pyc
if self.version == 1.4:
return False
# Make sure come froms all come from within "except_handler".
if end_token != "COME_FROM":
return False
return end_token.attr < tokens[first].offset

View File

@@ -105,6 +105,24 @@ IFELSE_STMT_RULES = frozenset(
"opt_come_from_except",
),
),
(
"ifelsestmt",
(
"testexpr",
"stmts",
"jf_cfs",
"\\e_else_suite_opt",
"\\e_opt_come_from_except")
),
(
"ifelsestmt",
(
"testexpr",
"stmts",
"jf_cfs",
"\\e_else_suite_opt",
"opt_come_from_except")
),
])
def ifelsestmt(self, lhs, n, rule, ast, tokens, first, last):
@@ -113,6 +131,11 @@ def ifelsestmt(self, lhs, n, rule, ast, tokens, first, last):
# ifelsestmt jumped outside of loop. No good.
return True
# print("XXX", first, last)
# for t in range(first, last):
# print(tokens[t])
# print("=" * 30)
if rule not in IFELSE_STMT_RULES:
# print("XXX", rule)
return False
@@ -186,9 +209,7 @@ def ifelsestmt(self, lhs, n, rule, ast, tokens, first, last):
if jump_else_end == "jf_cf_pop":
jump_else_end = jump_else_end[0]
jump_to_jump = False
if jump_else_end == "JUMP_FORWARD":
jump_to_jump = True
endif_target = int(jump_else_end.pattr)
last_offset = tokens[last].off2int()
if endif_target != last_offset:

View File

@@ -6,7 +6,13 @@ def tryelsestmt(self, lhs, n, rule, ast, tokens, first, last):
# Check the end of the except handler that there isn't a jump from
# inside the except handler to the end. If that happens
# then this is a "try" with no "else".
# for t in range(first, last):
# print(tokens[t])
# print("=" * 30)
except_handler = ast[3]
if except_handler == "except_handler_else":
except_handler = except_handler[0]
if except_handler == "except_handler":
@@ -32,5 +38,8 @@ def tryelsestmt(self, lhs, n, rule, ast, tokens, first, last):
except_handler_first_offset = leading_jump.first_child().off2int()
else:
except_handler_first_offset = leading_jump.off2int()
if first_come_from.attr < tokens[first].offset:
return True
return first_come_from.attr > except_handler_first_offset
return False
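To make the distinction concrete, here is a hedged sketch (names and values are invented for illustration) of the two source shapes that the tryelsestmt and plain try_except reductions must tell apart:

# try/except with an "else:" block -- the else suite runs only when the
# try body raised nothing; this is the "tryelsestmt" shape.
def parse_or_none(text):
    try:
        value = int(text)
    except ValueError:
        return None
    else:
        return value * 2

# Plain try/except for contrast: no else suite, so a jump out of the
# handler must not be mistaken for the start of an else block.
def parse_or_zero(text):
    try:
        value = int(text)
    except ValueError:
        value = 0
    return value

assert parse_or_none("21") == 42 and parse_or_none("x") is None
assert parse_or_zero("7") == 7 and parse_or_zero("x") == 0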

View File

@@ -142,7 +142,7 @@ def code_deparse_align(co, out=sys.stderr, version=None, is_pypy=None,
is_pypy = is_pypy)
isTopLevel = co.co_name == '<module>'
deparsed.ast = deparsed.build_ast(tokens, customize, isTopLevel=isTopLevel)
deparsed.ast = deparsed.build_ast(tokens, customize, co, isTopLevel=isTopLevel)
assert deparsed.ast == 'stmts', 'Should have parsed grammar start'

View File

@@ -142,17 +142,12 @@ PASS = SyntaxTree(
)
ASSIGN_DOC_STRING = lambda doc_string, doc_load: SyntaxTree(
"stmt",
"assign",
[
SyntaxTree(
"assign",
[
SyntaxTree(
"expr", [Token(doc_load, pattr=doc_string, attr=doc_string)]
),
SyntaxTree("store", [Token("STORE_NAME", pattr="__doc__")]),
],
)
"expr", [Token(doc_load, pattr=doc_string, attr=doc_string)]
),
SyntaxTree("store", [Token("STORE_NAME", pattr="__doc__")]),
],
)

View File

@@ -90,7 +90,7 @@ def customize_for_version3(self, version):
code_obj = node[1].attr
assert iscode(code_obj)
code = Code(code_obj, self.scanner, self.currentclass)
ast = self.build_ast(code._tokens, code._customize)
ast = self.build_ast(code._tokens, code._customize, code)
self.customize(code._customize)
# skip over: sstmt, stmt, return, ret_expr

View File

@@ -50,12 +50,24 @@ def customize_for_version38(self, version):
"%|%c\n", 0
),
"except_cond_as": (
"%|except %c as %c:\n",
(1, "expr"),
(-2, "STORE_FAST"),
),
'except_handler38': (
'%c', (2, 'except_stmts') ),
'except_handler38a': (
'%c', (-2, 'stmts') ),
"except_handler_as": (
"%c%+\n%+%c%-",
(1, "except_cond_as"),
(2, "tryfinallystmt"),
),
'except_ret38a': (
'return %c', (4, 'expr') ),
@@ -105,6 +117,13 @@ def customize_for_version38(self, version):
'try_except38': (
'%|try:\n%+%c\n%-%|except:\n%|%-%c\n\n',
(-2, 'suite_stmts_opt'), (-1, 'except_handler38a') ),
"try_except_as": (
"%|try:\n%+%c%-\n%|%-%c\n\n",
(-4, "suite_stmts"), # Go from the end because of POP_BLOCK variation
(-3, "except_handler_as"),
),
"try_except_ret38": (
"%|try:\n%+%c%-\n%|except:\n%+%|%c%-\n\n",
(1, "returns"),

View File

@@ -681,7 +681,7 @@ class FragmentsWalker(pysource.SourceWalker, object):
assert iscode(cn.attr)
code = Code(cn.attr, self.scanner, self.currentclass)
ast = self.build_ast(code._tokens, code._customize)
ast = self.build_ast(code._tokens, code._customize, code)
self.customize(code._customize)
ast = ast[0][0][0]
@@ -728,7 +728,7 @@ class FragmentsWalker(pysource.SourceWalker, object):
code_name = code.co_name
code = Code(code, self.scanner, self.currentclass)
ast = self.build_ast(code._tokens, code._customize)
ast = self.build_ast(code._tokens, code._customize, code)
self.customize(code._customize)
if ast[0] == "sstmt":
@@ -850,7 +850,7 @@ class FragmentsWalker(pysource.SourceWalker, object):
self.prec = 27
code = Code(node[1].attr, self.scanner, self.currentclass)
ast = self.build_ast(code._tokens, code._customize)
ast = self.build_ast(code._tokens, code._customize, code)
self.customize(code._customize)
if node == "set_comp":
ast = ast[0][0][0]
@@ -992,7 +992,7 @@ class FragmentsWalker(pysource.SourceWalker, object):
self.prec = 27
code = Code(node[1].attr, self.scanner, self.currentclass)
ast = self.build_ast(code._tokens, code._customize)
ast = self.build_ast(code._tokens, code._customize, code)
self.customize(code._customize)
ast = ast[0][0][0]
store = ast[3]
@@ -1142,9 +1142,8 @@ class FragmentsWalker(pysource.SourceWalker, object):
self.name = old_name
self.return_none = rn
def build_ast(
self, tokens, customize, is_lambda=False, noneInNames=False, isTopLevel=False
):
def build_ast(self, tokens, customize, code, is_lambda=False,
noneInNames=False, isTopLevel=False):
# FIXME: DRY with pysource.py

View File

@@ -100,6 +100,7 @@ def make_function2(self, node, is_lambda, nested=1, code_node=None):
ast = self.build_ast(
code._tokens,
code._customize,
code,
is_lambda=is_lambda,
noneInNames=("None" in code.co_names),
)

View File

@@ -132,6 +132,7 @@ def make_function3_annotate(
ast = self.build_ast(
code._tokens,
code._customize,
code,
is_lambda=is_lambda,
noneInNames=("None" in code.co_names),
)
@@ -491,6 +492,7 @@ def make_function3(self, node, is_lambda, nested=1, code_node=None):
ast = self.build_ast(
scanner_code._tokens,
scanner_code._customize,
scanner_code,
is_lambda=is_lambda,
noneInNames=("None" in code.co_names),
)

View File

@@ -173,6 +173,7 @@ def make_function36(self, node, is_lambda, nested=1, code_node=None):
ast = self.build_ast(
scanner_code._tokens,
scanner_code._customize,
scanner_code,
is_lambda=is_lambda,
noneInNames=("None" in code.co_names),
)

View File

@@ -164,7 +164,6 @@ from uncompyle6.semantics.consts import (
NONE,
RETURN_NONE,
PASS,
ASSIGN_DOC_STRING,
NAME_MODULE,
TAB,
INDENT_PER_LEVEL,
@@ -588,8 +587,7 @@ class SourceWalker(GenericASTTraversal, object):
if n == "LOAD_CONST" and repr(n.pattr)[0] == "-":
self.prec = 6
# print(n.kind, p, "<", self.prec)
# print(self.f.getvalue())
# print("XXX", n.kind, p, "<", self.prec)
if p < self.prec:
self.write("(")
@@ -605,7 +603,6 @@ class SourceWalker(GenericASTTraversal, object):
# If expr is yield we want parens.
self.prec = PRECEDENCE["yield"] - 1
self.n_expr(node[0])
p = self.prec
else:
self.n_expr(node)
@@ -895,6 +892,12 @@ class SourceWalker(GenericASTTraversal, object):
doc_node = node[0]
if doc_node.attr:
docstring = doc_node.attr
if not isinstance(docstring, str):
# FIXME: we have mistakenly tagged something as a doc
# string in transform when it isn't one.
# The rule in n_mkfunc is pretty flaky.
self.prune()
return
else:
docstring = node[0].pattr
@@ -1114,7 +1117,7 @@ class SourceWalker(GenericASTTraversal, object):
assert iscode(cn.attr)
code = Code(cn.attr, self.scanner, self.currentclass)
ast = self.build_ast(code._tokens, code._customize)
ast = self.build_ast(code._tokens, code._customize, code)
self.customize(code._customize)
# Remove single reductions as in ("stmts", "sstmt"):
@@ -1198,7 +1201,7 @@ class SourceWalker(GenericASTTraversal, object):
assert iscode(code), node[code_index]
code = Code(code, self.scanner, self.currentclass)
ast = self.build_ast(code._tokens, code._customize)
ast = self.build_ast(code._tokens, code._customize, code)
self.customize(code._customize)
# skip over: sstmt, stmt, return, ret_expr
@@ -1393,7 +1396,7 @@ class SourceWalker(GenericASTTraversal, object):
self.prec = 27
code = Code(node[1].attr, self.scanner, self.currentclass)
ast = self.build_ast(code._tokens, code._customize)
ast = self.build_ast(code._tokens, code._customize, code)
self.customize(code._customize)
# Remove single reductions as in ("stmts", "sstmt"):
@@ -2307,7 +2310,7 @@ class SourceWalker(GenericASTTraversal, object):
indent = self.indent
# self.println(indent, '#flags:\t', int(code.co_flags))
ast = self.build_ast(code._tokens, code._customize)
ast = self.build_ast(code._tokens, code._customize, code)
code._tokens = None # save memory
assert ast == "stmts"
@@ -2318,6 +2321,7 @@ class SourceWalker(GenericASTTraversal, object):
if ast[0] == "docstring":
self.println(self.traverse(ast[0]))
del ast[0]
first_stmt = ast[0]
if 3.0 <= self.version <= 3.3:
try:
@@ -2381,10 +2385,10 @@ class SourceWalker(GenericASTTraversal, object):
# if docstring exists, dump it
if code.co_consts and code.co_consts[0] is not None and len(ast) > 0:
do_doc = False
if is_docstring(ast[0]):
if is_docstring(ast[0], self.version, code.co_consts):
i = 0
do_doc = True
elif len(ast) > 1 and is_docstring(ast[1]):
elif len(ast) > 1 and is_docstring(ast[1], self.version, code.co_consts):
i = 1
do_doc = True
if do_doc and self.hide_internal:
@@ -2460,7 +2464,7 @@ class SourceWalker(GenericASTTraversal, object):
self.return_none = rn
def build_ast(
self, tokens, customize, is_lambda=False, noneInNames=False, isTopLevel=False
self, tokens, customize, code, is_lambda=False, noneInNames=False, isTopLevel=False
):
# FIXME: DRY with fragments.py
@@ -2480,13 +2484,13 @@ class SourceWalker(GenericASTTraversal, object):
p_insts = self.p.insts
self.p.insts = self.scanner.insts
self.p.offset2inst_index = self.scanner.offset2inst_index
ast = python_parser.parse(self.p, tokens, customize)
ast = python_parser.parse(self.p, tokens, customize, code)
self.customize(customize)
self.p.insts = p_insts
except (python_parser.ParserError, AssertionError) as e:
raise ParserError(e, tokens, self.p.debug['reduce'])
transform_ast = self.treeTransform.transform(ast)
transform_ast = self.treeTransform.transform(ast, code)
self.maybe_show_tree(ast)
del ast # Save memory
return transform_ast
@@ -2518,7 +2522,7 @@ class SourceWalker(GenericASTTraversal, object):
self.p.insts = self.scanner.insts
self.p.offset2inst_index = self.scanner.offset2inst_index
self.p.opc = self.scanner.opc
ast = python_parser.parse(self.p, tokens, customize)
ast = python_parser.parse(self.p, tokens, customize, code)
self.p.insts = p_insts
except (python_parser.ParserError, AssertionError) as e:
raise ParserError(e, tokens, self.p.debug['reduce'])
@@ -2526,7 +2530,7 @@ class SourceWalker(GenericASTTraversal, object):
checker(ast, False, self.ast_errors)
self.customize(customize)
transform_ast = self.treeTransform.transform(ast)
transform_ast = self.treeTransform.transform(ast, code)
self.maybe_show_tree(ast)
@@ -2588,7 +2592,7 @@ def code_deparse(
)
isTopLevel = co.co_name == "<module>"
deparsed.ast = deparsed.build_ast(tokens, customize, isTopLevel=isTopLevel)
deparsed.ast = deparsed.build_ast(tokens, customize, co, isTopLevel=isTopLevel)
#### XXX workaround for profiling
if deparsed.ast is None:

View File

@@ -13,7 +13,6 @@
# You should have received a copy of the GNU General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
from xdis import iscode
from uncompyle6.show import maybe_show_tree
from copy import copy
from spark_parser import GenericASTTraversal, GenericASTTraversalPruningException
@@ -21,16 +20,39 @@ from spark_parser import GenericASTTraversal, GenericASTTraversalPruningExceptio
from uncompyle6.semantics.helper import find_code_node
from uncompyle6.parsers.treenode import SyntaxTree
from uncompyle6.scanners.tok import NoneToken, Token
from uncompyle6.semantics.consts import RETURN_NONE
from uncompyle6.semantics.consts import RETURN_NONE, ASSIGN_DOC_STRING
def is_docstring(node):
def is_docstring(node, version, co_consts):
if node == "sstmt":
node = node[0]
try:
return node.kind == "assign" and node[1][0].pattr == "__doc__"
except:
return False
# TODO: the test below on 2.7 succeeds for
# class OldClass:
# __doc__ = DocDescr()
# which produces:
#
# assign (2)
# 0. expr
# call (2)
# 0. expr
# L. 16 6 LOAD_DEREF 0 'DocDescr'
# 1. 9 CALL_FUNCTION_0 0 None
# 1. store
#
# See Python 2.7 test_descr.py
# If ASSIGN_DOC_STRING doesn't work we need something like the below
# but more elaborate to address the above.
# try:
# return node.kind == "assign" and node[1][0].pattr == "__doc__"
# except:
# return False
if version <= 2.7:
doc_load = "LOAD_CONST"
else:
doc_load = "LOAD_STR"
return node == ASSIGN_DOC_STRING(co_consts[0], doc_load)
def is_not_docstring(call_stmt_node):
@@ -95,10 +117,18 @@ class TreeTransform(GenericASTTraversal, object):
code = find_code_node(node, code_index).attr
mkfunc_pattr = node[-1].pattr
if isinstance(mkfunc_pattr, tuple):
assert len(mkfunc_pattr, 4) and isinstance(mkfunc_pattr, int)
is_closure = node[-1].pattr[3] != 0
else:
# FIXME: This is what we had before. It is hoaky and probably wrong.
is_closure = mkfunc_pattr == "closure"
if (
node[-1].pattr != "closure"
(not is_closure)
and len(code.co_consts) > 0
and code.co_consts[0] is not None
and isinstance(code.co_consts[0], str)
):
docstring_node = SyntaxTree(
"docstring", [Token("LOAD_STR", has_arg=True, pattr=code.co_consts[0])]
@@ -409,7 +439,7 @@ class TreeTransform(GenericASTTraversal, object):
node = self.preorder(node)
return node
def transform(self, ast):
def transform(self, ast, code):
self.maybe_show_tree(ast)
self.ast = copy(ast)
self.ast = self.traverse(self.ast, is_lambda=False)
@@ -430,9 +460,10 @@ class TreeTransform(GenericASTTraversal, object):
for i in range(len(self.ast)):
sstmt = ast[i]
if len(sstmt) == 1 and sstmt == "sstmt":
ast[i] = ast[i][0]
self.ast[i] = self.ast[i][0]
if is_docstring(self.ast[i]):
if is_docstring(self.ast[i], self.version, code.co_consts):
load_const = self.ast[i].first_child()
docstring_ast = SyntaxTree(
"docstring",
[
@@ -440,8 +471,8 @@ class TreeTransform(GenericASTTraversal, object):
"LOAD_STR",
has_arg=True,
offset=0,
attr=self.ast[i][0][0].attr,
pattr=self.ast[i][0][0].pattr,
attr=load_const.attr,
pattr=load_const.pattr,
)
],
)

View File

@@ -41,7 +41,7 @@ def better_repr(v, version):
else:
return s
elif isinstance(v, list):
l = better_repr(v)
better_repr(v)
if len(v) == 1:
return "[%s,]" % better_repr(v[0], version)
return "[%s]" % ", ".join(better_repr(i) for i in v)

View File

@@ -12,4 +12,4 @@
# along with this program. If not, see <http://www.gnu.org/licenses/>.
# This file is suitable for sourcing inside POSIX shell as
# well as importing into Python
VERSION="3.7.1" # noqa
VERSION="3.7.3" # noqa