Current Path: /opt/alt/python312/lib64/python3.12/lib2to3/pgen2/__pycache__
OS         : Linux premium131.web-hosting.com 4.18.0-553.44.1.lve.el8.x86_64 #1 SMP Thu Mar 13 14:29:12 UTC 2025 x86_64
Software   : Apache
Server IP  : 162.0.232.56 | Your IP : 216.73.216.111
Domains    : 1034 domain(s)
Permission : [ 0755 ]
Name | Type | Size | Last Modified
---|---|---|---
__init__.cpython-312.opt-1.pyc | File | 193 bytes | June 23 2025 14:00:19
__init__.cpython-312.opt-2.pyc | File | 163 bytes | June 23 2025 14:00:28
__init__.cpython-312.pyc | File | 193 bytes | June 23 2025 14:00:19
conv.cpython-312.opt-1.pyc | File | 9919 bytes | June 23 2025 14:00:24
conv.cpython-312.opt-2.pyc | File | 7469 bytes | June 23 2025 14:00:28
conv.cpython-312.pyc | File | 11773 bytes | June 23 2025 14:00:19
driver.cpython-312.opt-1.pyc | File | 8042 bytes | June 23 2025 14:00:24
driver.cpython-312.opt-2.pyc | File | 7095 bytes | June 23 2025 14:00:28
driver.cpython-312.pyc | File | 8112 bytes | June 23 2025 14:00:19
grammar.cpython-312.opt-1.pyc | File | 7018 bytes | June 23 2025 14:00:19
grammar.cpython-312.opt-2.pyc | File | 3902 bytes | June 23 2025 14:00:28
grammar.cpython-312.pyc | File | 7018 bytes | June 23 2025 14:00:19
literals.cpython-312.opt-1.pyc | File | 2146 bytes | June 23 2025 14:00:24
literals.cpython-312.opt-2.pyc | File | 2073 bytes | June 23 2025 14:00:28
literals.cpython-312.pyc | File | 2607 bytes | June 23 2025 14:00:19
parse.cpython-312.opt-1.pyc | File | 8715 bytes | June 23 2025 14:00:24
parse.cpython-312.opt-2.pyc | File | 5542 bytes | June 23 2025 14:00:28
parse.cpython-312.pyc | File | 8738 bytes | June 23 2025 14:00:19
pgen.cpython-312.opt-1.pyc | File | 17731 bytes | June 23 2025 14:00:24
pgen.cpython-312.opt-2.pyc | File | 17731 bytes | June 23 2025 14:00:24
pgen.cpython-312.pyc | File | 18718 bytes | June 23 2025 14:00:19
token.cpython-312.opt-1.pyc | File | 2258 bytes | June 23 2025 14:00:19
token.cpython-312.opt-2.pyc | File | 2212 bytes | June 23 2025 14:00:28
token.cpython-312.pyc | File | 2258 bytes | June 23 2025 14:00:19
tokenize.cpython-312.opt-1.pyc | File | 20824 bytes | June 23 2025 14:00:24
tokenize.cpython-312.opt-2.pyc | File | 16965 bytes | June 23 2025 14:00:28
tokenize.cpython-312.pyc | File | 20951 bytes | June 23 2025 14:00:19
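Each module appears three times because CPython writes a separate cache file per optimization level (plain, -O, -OO). As an illustrative sketch, not tied to this server, the naming scheme can be reproduced with `importlib.util.cache_from_source`; the `cpython-312` tag in the output assumes the script itself runs under CPython 3.12:

```python
import importlib.util

# Compute the cache-file name for each optimization level.
# Level 0 -> conv.cpython-312.pyc, 1 -> conv.cpython-312.opt-1.pyc,
# 2 -> conv.cpython-312.opt-2.pyc (paths are prefixed with __pycache__/).
for level in (0, 1, 2):
    print(importlib.util.cache_from_source("conv.py", optimization=level or ""))

# Directories like this one are typically populated in bulk, e.g.:
#   python3 -m compileall -o 0 -o 1 -o 2 /opt/alt/python312/lib64/python3.12
```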
Preview of tokenize.cpython-312.pyc: compiled CPython 3.12 bytecode for lib2to3.pgen2.tokenize. The binary itself is not human-readable; the strings that survive in the preview are the module docstring, the author/credits constants, and fragments of the regex helper functions (group, any, maybe, _combinations). The recoverable docstring reads:

    Tokenization help for Python programs.

    generate_tokens(readline) is a generator that breaks a stream of text into
    Python tokens. It accepts a readline-like method which is called repeatedly
    to get the next line of input (or "" for EOF). It generates 5-tuples with
    these members:

        the token type (see token.py)
        the token (a string)
        the starting (row, column) indices of the token (a 2-tuple of ints)
        the ending (row, column) indices of the token (a 2-tuple of ints)
        the original line (string)

    It is designed to match the working of the Python tokenizer exactly, except
    that it produces COMMENT tokens for comments and gives type OP for all
    operators.

    The older entry points, tokenize_loop(readline, tokeneater) and
    tokenize(readline, tokeneater=printtoken), are the same, except that
    instead of generating tokens, tokeneater is a callback function to which
    the 5 fields described above are passed as 5 arguments each time a new
    token is found.

    Author:  Ka-Ping Yee <ping@lfw.org>
    Credits: GvR, ESR, Tim Peters, Thomas Wouters, Fred Drake, Skip Montanaro
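Since the recovered docstring documents the generate_tokens API, a minimal usage sketch may help. It assumes a Python 3.12 interpreter, where lib2to3 still ships but is deprecated and warns on import:

```python
import io
from lib2to3.pgen2 import token, tokenize  # deprecated package; emits a DeprecationWarning

source = "x = 1 + 2  # a comment\n"
readline = io.StringIO(source).readline  # any readline-like callable works

# generate_tokens yields 5-tuples:
# (type, string, (start_row, start_col), (end_row, end_col), original line)
for tok_type, tok_string, start, end, line in tokenize.generate_tokens(readline):
    print(token.tok_name[tok_type], repr(tok_string), start, end)
```

As the docstring notes, and unlike the standard library's tokenize module, comments come back as COMMENT tokens and every operator is reported with type OP.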