Current Path: /opt/alt/python311/lib64/python3.11/lib2to3/pgen2/__pycache__
Operating system: Linux premium131.web-hosting.com 4.18.0-553.44.1.lve.el8.x86_64 #1 SMP Thu Mar 13 14:29:12 UTC 2025 x86_64
Software: Apache | Server IP: 162.0.232.56 | Your IP: 216.73.216.111 | Domains: 1034 domain(s) | Permission: 0755
| Name | Type | Size | Last Modified |
|---|---|---|---|
| __init__.cpython-311.opt-1.pyc | File | 204 bytes | June 23 2025 15:48:01 |
| __init__.cpython-311.opt-2.pyc | File | 168 bytes | June 23 2025 15:48:05 |
| __init__.cpython-311.pyc | File | 204 bytes | June 23 2025 15:48:01 |
| conv.cpython-311.opt-1.pyc | File | 11077 bytes | June 23 2025 15:48:03 |
| conv.cpython-311.opt-2.pyc | File | 8617 bytes | June 23 2025 15:48:05 |
| conv.cpython-311.pyc | File | 13261 bytes | June 23 2025 15:48:01 |
| driver.cpython-311.opt-1.pyc | File | 8738 bytes | June 23 2025 15:48:03 |
| driver.cpython-311.opt-2.pyc | File | 7788 bytes | June 23 2025 15:48:05 |
| driver.cpython-311.pyc | File | 8822 bytes | June 23 2025 15:48:01 |
| grammar.cpython-311.opt-1.pyc | File | 7571 bytes | June 23 2025 15:48:01 |
| grammar.cpython-311.opt-2.pyc | File | 4445 bytes | June 23 2025 15:48:05 |
| grammar.cpython-311.pyc | File | 7571 bytes | June 23 2025 15:48:01 |
| literals.cpython-311.opt-1.pyc | File | 2476 bytes | June 23 2025 15:48:03 |
| literals.cpython-311.opt-2.pyc | File | 2400 bytes | June 23 2025 15:48:05 |
| literals.cpython-311.pyc | File | 3090 bytes | June 23 2025 15:48:01 |
| parse.cpython-311.opt-1.pyc | File | 9019 bytes | June 23 2025 15:48:03 |
| parse.cpython-311.opt-2.pyc | File | 5832 bytes | June 23 2025 15:48:05 |
| parse.cpython-311.pyc | File | 9046 bytes | June 23 2025 15:48:01 |
| pgen.cpython-311.opt-1.pyc | File | 19051 bytes | June 23 2025 15:48:03 |
| pgen.cpython-311.opt-2.pyc | File | 19051 bytes | June 23 2025 15:48:03 |
| pgen.cpython-311.pyc | File | 20269 bytes | June 23 2025 15:48:01 |
| token.cpython-311.opt-1.pyc | File | 2379 bytes | June 23 2025 15:48:01 |
| token.cpython-311.opt-2.pyc | File | 2330 bytes | June 23 2025 15:48:05 |
| token.cpython-311.pyc | File | 2379 bytes | June 23 2025 15:48:01 |
| tokenize.cpython-311.opt-1.pyc | File | 24043 bytes | June 23 2025 15:48:03 |
| tokenize.cpython-311.opt-2.pyc | File | 20177 bytes | June 23 2025 15:48:05 |
| tokenize.cpython-311.pyc | File | 24163 bytes | June 23 2025 15:48:01 |
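Each module in this cache directory appears in three variants, corresponding to bytecode written at the three CPython optimization levels: no suffix (level 0), `.opt-1` (`python -O`, asserts stripped), and `.opt-2` (`python -OO`, asserts and docstrings stripped, which is why the `.opt-2` files above are generally the smallest). A minimal sketch of how these cache names are derived with the standard `importlib` API; the source path is taken from the breadcrumb above and is only an assumption about where the `.py` files live on this host:

```python
import importlib.util

# Source module whose caches appear in the listing above (path assumed from the breadcrumb).
src = "/opt/alt/python311/lib64/python3.11/lib2to3/pgen2/tokenize.py"

# optimization="" -> tokenize.cpython-311.pyc
# optimization=1  -> tokenize.cpython-311.opt-1.pyc
# optimization=2  -> tokenize.cpython-311.opt-2.pyc
for level in ("", 1, 2):
    print(importlib.util.cache_from_source(src, optimization=level))
```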
[The remainder of the page is the raw contents of tokenize.cpython-311.pyc — compiled CPython 3.11 bytecode rendered as text, which is not human-readable. The only recoverable plain text is the module docstring and credits of lib2to3/pgen2/tokenize.py:]

> Tokenization help for Python programs.
>
> generate_tokens(readline) is a generator that breaks a stream of text into Python tokens. It accepts a readline-like method which is called repeatedly to get the next line of input (or "" for EOF). It generates 5-tuples with these members:
>
> - the token type (see token.py)
> - the token (a string)
> - the starting (row, column) indices of the token (a 2-tuple of ints)
> - the ending (row, column) indices of the token (a 2-tuple of ints)
> - the original line (string)
>
> It is designed to match the working of the Python tokenizer exactly, except that it produces COMMENT tokens for comments and gives type OP for all operators.
>
> Older entry points
>     tokenize_loop(readline, tokeneater)
>     tokenize(readline, tokeneater=printtoken)
> are the same, except instead of generating tokens, tokeneater is a callback function to which the 5 fields described above are passed as 5 arguments, each time a new token is found.
>
> Author: Ka-Ping Yee <ping@lfw.org>
> Credits: GvR, ESR, Tim Peters, Thomas Wouters, Fred Drake, Skip Montanaro
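The docstring above describes the `generate_tokens(readline)` API exposed by `lib2to3.pgen2.tokenize`. A minimal sketch of driving it on this Python 3.11 install (where lib2to3 is still importable, though the package is deprecated); the sample source string is made up for illustration:

```python
import io
from lib2to3.pgen2 import tokenize as pgen2_tokenize
from lib2to3.pgen2 import token as pgen2_token

# Any readline-like callable works; here we wrap a sample source string.
source = "def add(a, b):\n    return a + b  # sum\n"
readline = io.StringIO(source).readline

# Each yielded item is the 5-tuple described in the docstring:
# (type, string, (start_row, start_col), (end_row, end_col), line)
for tok_type, tok_string, start, end, line in pgen2_tokenize.generate_tokens(readline):
    print(pgen2_token.tok_name[tok_type], repr(tok_string), start, end)
```

As the docstring notes, comments come through as COMMENT tokens and every operator is reported with type OP.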