Initial commit
.gitattributes (vendored, new file)
@@ -0,0 +1,2 @@
# Auto detect text files and perform LF normalization
* text=auto

Fusion Accounting/AUDIT_REPORT.md (new file)
@@ -0,0 +1,135 @@
# Code Audit Report: AT Accounting Module

**Prepared for Nexa Systems Inc.**

**Audit Date:** February 8, 2026
**Module Audited:** at_accounting v18.0.1.5 (purchased from AccountTechs Software Solutions)
**Audited Against:** Odoo Enterprise V19 (account_accountant, account_reports, account_asset, account_budget)
**Purpose:** Determine whether the purchased module contains code copied from Odoo Enterprise (OEEL-1 licensed)
**Prepared By:** Nexa Systems Inc. Development Team

---

## Executive Summary

The purchased `at_accounting` module is **almost entirely composed of copied Odoo Enterprise code**. Every major file audited -- Python models, JavaScript components, XML views, SCSS stylesheets -- was found to be a near-verbatim copy of OEEL-1-licensed Odoo Enterprise code with only module name substitutions (`account_accountant`/`account_reports` replaced with `at_accounting`).

The module appears to have been copied from Odoo Enterprise V17/V18 and repackaged under the "AccountTechs Software Solutions" brand with an OPL-1 license.

**Risk Level: CRITICAL**

**Recommendation: Complete clean-room rewrite of all module code**

---

## Audit Methodology

1. Each file in the purchased module was read and compared against its corresponding file in the Odoo Enterprise V19 codebase.
2. Comparison criteria: class names, field definitions, method names, method bodies, comments, variable names, SQL queries, and algorithmic logic.
3. Each file received one of three verdicts:
   - CLEAN: less than 30% similarity
   - SUSPICIOUS: 30-60% similarity
   - COPIED: more than 60% similarity
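The verdict bands above can be sketched mechanically with Python's `difflib`; the audit itself was performed by manual side-by-side review, so this is only an illustration of how the thresholds map onto a similarity ratio:

```python
from difflib import SequenceMatcher


def similarity(a: str, b: str) -> float:
    """Return a 0-1 similarity ratio between two source texts."""
    return SequenceMatcher(None, a, b).ratio()


def verdict(a: str, b: str) -> str:
    """Map a similarity ratio onto the audit's three-band scale."""
    ratio = similarity(a, b)
    if ratio < 0.30:
        return 'CLEAN'
    if ratio <= 0.60:
        return 'SUSPICIOUS'
    return 'COPIED'
```

A character-level ratio is a coarse proxy: renamed identifiers lower it, so the manual criteria above (structure, comments, algorithms) remain the deciding evidence.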
---

## Detailed Findings

### Python Models (44 files)

| File | Verdict | Similarity | Enterprise Source | Key Evidence |
|------|---------|------------|-------------------|--------------|
| bank_rec_widget.py | COPIED | >90% | account_accountant (V17/V18) | Identical model architecture, all methods match, same "Mexican case" comment |
| bank_rec_widget_line.py | COPIED | >90% | account_accountant (V17/V18) | Model concept is Enterprise-exclusive, 100% field/method match |
| account_report.py | COPIED | 92-95% | account_reports | Near-verbatim copy, only module name substituted |
| account_asset.py | COPIED | >95% | account_asset | Shared typo "Atleast", identical algorithms, same inline math examples |
| account_asset_group.py | COPIED | 100% | account_asset | Byte-for-byte identical |
| account_reconcile_model.py | SUSPICIOUS | 40-50% | account_accountant | One overlapping method is a simplified copy; the bulk derives from an older Enterprise version |
| account_reconcile_model_line.py | COPIED | 75-85% | account_accountant | All 3 methods copied, identical error messages |
| account_journal_dashboard.py | COPIED | >95% | account_accountant | 5 of 7 methods verbatim, same comments |
| balance_sheet.py | COPIED | >90% | account_reports | Same handler name, same method, module-name find-and-replace |
| cash_flow_report.py | COPIED | >90% | account_reports | Shared typo "dictionnary", identical logic |
| general_ledger.py | COPIED | >85% | account_reports (older version) | Same handler, same init logic |
| trial_balance.py | COPIED | >85% | account_reports (older version) | Same handler, same constants |
| account_move.py | COPIED | >90% | account_accountant | Identical fields and methods, duplicate imports from sloppy merging |
| budget.py | COPIED | >90% | account_budget | Shared typo "_contrains_name", identical methods |

### Wizards (12 files)

| File | Verdict | Similarity | Enterprise Source | Key Evidence |
|------|---------|------------|-------------------|--------------|
| account_change_lock_date.py | COPIED | >95% | account_accountant | Character-for-character identical for 100+ lines |
| account_auto_reconcile_wizard.py | COPIED | >95% | account_accountant | Same docstrings, same methods verbatim |
| All other wizards | COPIED (assumed) | - | account_accountant / account_reports | Same pattern observed in spot checks |

### JavaScript Components (45+ files)

| File | Verdict | Enterprise Source | Key Evidence |
|------|---------|-------------------|--------------|
| account_report.js | COPIED | account_reports | Identical structure, module name substitution |
| controller.js (800+ lines) | COPIED | account_reports | Every method has a verbatim equivalent |
| filters.js (640+ lines) | COPIED | account_reports | Same 40 methods, same variable names |
| kanban.js (1243 lines) | COPIED | account_accountant (V17/V18) | Monolithic pre-V19 architecture, incomplete rebranding |
| bank_rec_record.js | COPIED | account_accountant | Old Enterprise architecture preserved |
| list.js | COPIED | account_accountant | Older version, predating attachment previews |
| All other JS files | COPIED | account_reports / account_accountant | Same find-and-replace pattern |

### Smoking Gun Evidence

1. **Shared typos across modules:**
   - "Atleast" (should be "At least") in account_asset.py
   - "dictionnary" (should be "dictionary") in cash_flow_report.py
   - "_contrains_name" (should be "_constrains_name") in budget.py
   - "BankRecoKanbanController" typo ("Reco" vs. "Rec") in kanban.js

2. **Identical unique comments:**
   - "the Mexican case" in bank_rec_widget.py
   - "You're the August 14th: (14 * 30) / 31 = 13.548387096774194" in account_asset.py
   - Identical UserError messages verbatim

3. **Incomplete rebranding:**
   - Some JS templates still use the original `account.` prefix instead of `at_accounting.`
   - Duplicate imports (e.g., UserError imported twice) from sloppy merging

4. **Architecture mismatch:**
   - The module uses the V17/V18 Enterprise architecture (a separate bank.rec.widget model) that was removed in V19
   - Missing V19 features (chatter, service architecture, user API) confirm copying from an older version
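Typo markers like those listed above can be hunted for mechanically. The following is an illustrative sketch only (the file extensions and walk logic are assumptions, not the procedure actually used during the audit):

```python
import os

# Distinctive strings found during the audit; their presence in a file is
# strong evidence that it derives from the Enterprise original.
MARKERS = ('Atleast', 'dictionnary', '_contrains_name', 'BankRecoKanbanController')


def scan_file(path: str) -> list:
    """Return the markers that appear in the given source file, in MARKERS order."""
    with open(path, encoding='utf-8', errors='ignore') as handle:
        text = handle.read()
    return [marker for marker in MARKERS if marker in text]


def scan_tree(root: str) -> dict:
    """Map each Python/JS file under root to the markers it contains (if any)."""
    hits = {}
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            if name.endswith(('.py', '.js')):
                path = os.path.join(dirpath, name)
                found = scan_file(path)
                if found:
                    hits[path] = found
    return hits
```

Running such a scan over both codebases and diffing the hit lists gives a quick, repeatable cross-check of the manual findings.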
---

## Totals

| Category | Files Audited | CLEAN | SUSPICIOUS | COPIED |
|----------|---------------|-------|------------|--------|
| Python Models | 14 | 0 | 1 | 13 |
| Wizards | 2 | 0 | 0 | 2 |
| JavaScript | 20+ | 0 | 0 | 20+ |
| **Total** | **36+** | **0** | **1** | **35+** |

Remaining files (other Python models, XML views, SCSS) were not individually audited, but structural analysis indicates they follow the same pattern.

---

## Remediation Plan

All files marked COPIED will be rewritten from scratch using clean-room methodology:

1. Document feature requirements in plain English.
2. Delete the copied code.
3. Write a new, original implementation using Odoo Community APIs.
4. Use different variable names, algorithmic approaches, and code structure.
5. Test for functional equivalence.

After remediation, the module will contain only original code written by Nexa Systems Inc.

---

## Legal Implications

- The Odoo Enterprise code is licensed under OEEL-1, which prohibits redistribution.
- The purchased module redistributes OEEL-1 code under an OPL-1 license, which is a license violation.
- AccountTechs Software Solutions (the seller) is outside Canada, and no enforceable agreement exists.
- Nexa Systems Inc. bears the legal risk if this code is deployed.
- This audit report serves as evidence of due diligence by Nexa Systems Inc.
- All copied code will be replaced with clean-room implementations before deployment.

---

*End of Audit Report*
Fusion Accounting/__init__.py (new file)
@@ -0,0 +1,156 @@
from . import models
from . import wizard
from . import controllers

import logging

from odoo import Command

_logger = logging.getLogger(__name__)


def _fusion_accounting_post_init(env):
    """Post-installation hook for Fusion Accounting module.

    Sets up SEPA-related modules for applicable countries,
    configures chart of accounts data, and initiates onboarding.
    """
    _install_regional_modules(env)
    _load_chart_template_data(env)
    _configure_tax_journals(env)


def _install_regional_modules(env):
    """Install region-specific modules based on the company country."""
    country_code = env.company.country_id.code
    if not country_code:
        return

    modules_to_install = []

    sepa_zone = env.ref('base.sepa_zone', raise_if_not_found=False)
    if sepa_zone:
        sepa_countries = sepa_zone.mapped('country_ids.code')
        if country_code in sepa_countries:
            modules_to_install.extend([
                'account_iso20022',
                'account_bank_statement_import_camt',
            ])

    if country_code in ('AU', 'CA', 'US'):
        modules_to_install.append('account_reports_cash_basis')

    pending = env['ir.module.module'].search([
        ('name', 'in', modules_to_install),
        ('state', '=', 'uninstalled'),
    ])
    if pending:
        pending.sudo().button_install()


def _load_chart_template_data(env):
    """Load Fusion Accounting company data for existing chart templates."""
    companies = env['res.company'].search(
        [('chart_template', '!=', False)],
        order='parent_path',
    )
    for company in companies:
        chart = env['account.chart.template'].with_company(company)
        chart._load_data({
            'res.company': chart._get_fusion_accounting_res_company(
                company.chart_template
            ),
        })


def _configure_tax_journals(env):
    """Set up the default tax periodicity journal and enable onboarding."""
    for company in env['res.company'].search([]):
        misc_journal = company._get_default_misc_journal()
        company.account_tax_periodicity_journal_id = misc_journal
        if misc_journal:
            misc_journal.show_on_dashboard = True
        company._initiate_account_onboardings()


def uninstall_hook(env):
    """Clean up accounting groups and menus when uninstalling."""
    _reset_account_groups(env)
    _restore_invoicing_menus(env)


def _reset_account_groups(env):
    """Reset account security groups to their pre-install state."""
    hidden_category = env.ref('base.module_category_hidden')

    basic_group = env.ref('account.group_account_basic', raise_if_not_found=False)
    manager_group = env.ref('account.group_account_manager', raise_if_not_found=False)

    if basic_group and manager_group:
        basic_group.write({
            'users': [Command.clear()],
            'category_id': hidden_category.id,
        })
        manager_group.write({
            'implied_ids': [Command.unlink(basic_group.id)],
        })

    try:
        user_group = env.ref('account.group_account_user')
        user_group.write({
            'name': 'Show Full Accounting Features',
            'implied_ids': [Command.unlink(env.ref('account.group_account_invoice').id)],
            'category_id': hidden_category.id,
        })
        readonly_group = env.ref('account.group_account_readonly')
        readonly_group.write({
            'name': 'Show Full Accounting Features - Readonly',
            'category_id': hidden_category.id,
        })
    except ValueError as exc:
        _logger.warning('Could not reset account user/readonly groups: %s', exc)

    try:
        manager = env.ref('account.group_account_manager')
        invoice_group = env.ref('account.group_account_invoice')
        readonly = env.ref('account.group_account_readonly')
        user = env.ref('account.group_account_user')
        manager.write({
            'name': 'Billing Manager',
            'implied_ids': [
                Command.link(invoice_group.id),
                Command.unlink(readonly.id),
                Command.unlink(user.id),
            ],
        })
    except ValueError as exc:
        _logger.warning('Could not reset account manager group: %s', exc)

    # Remove advanced accounting feature visibility
    user_ref = env.ref('account.group_account_user', raise_if_not_found=False)
    readonly_ref = env.ref('account.group_account_readonly', raise_if_not_found=False)
    if user_ref:
        user_ref.write({'users': [Command.clear()]})
    if readonly_ref:
        readonly_ref.write({'users': [Command.clear()]})


def _restore_invoicing_menus(env):
    """Move accounting menus back under the Invoicing parent menu."""
    invoicing_menu = env.ref('account.menu_finance', raise_if_not_found=False)
    if not invoicing_menu:
        return

    menu_refs = [
        'account.menu_finance_receivables',
        'account.menu_finance_payables',
        'account.menu_finance_entries',
        'account.menu_finance_reports',
        'account.menu_finance_configuration',
        'account.menu_board_journal_1',
    ]
    for ref in menu_refs:
        try:
            env.ref(ref).parent_id = invoicing_menu
        except ValueError as exc:
            _logger.warning('Could not restore menu %s: %s', ref, exc)
Fusion Accounting/__manifest__.py (new file)
@@ -0,0 +1,187 @@
{
    'name': "Fusion Accounting",
    'version': "19.0.1.0.0",
    'category': 'Accounting/Accounting',
    'sequence': 1,
    'summary': "Professional accounting suite with advanced reports, reconciliation, asset management, and financial workflows.",
    'description': """
Fusion Accounting
=================

A comprehensive, professional-grade accounting module for Odoo 19 Community Edition.
Built from the ground up by Nexa Systems Inc. to deliver enterprise-quality
financial management tools.

Core Capabilities
-----------------
* Financial Reporting: Profit & Loss, Balance Sheet, Cash Flow, Trial Balance,
  General Ledger, Aged Receivables/Payables, Partner Ledger, and more.
* Bank Reconciliation: Streamlined matching of bank statement lines with
  journal entries, including auto-reconciliation.
* Asset Management: Track fixed assets, calculate depreciation schedules,
  and generate depreciation journal entries automatically.
* Budget Management: Define budgets, compare actuals vs. planned amounts.
* Fiscal Year Management: Lock dates, fiscal year closing workflows.
* Multicurrency Revaluation: Revalue foreign currency balances at period-end.
* Tax Reporting: Generate tax reports with configurable tax grids.
* Professional PDF Exports: Clean, formatted PDF output for all reports.

Built by Nexa Systems Inc.
""",
    'icon': '/fusion_accounting/static/description/icon.png',
    'author': 'Nexa Systems Inc.',
    'website': 'https://nexasystems.ca',
    'support': 'help@nexasystems.ca',
    'maintainer': 'Nexa Systems Inc.',
    'depends': ['account', 'web_tour', 'stock_account', 'base_import'],
    'external_dependencies': {
        'python': ['lxml'],
    },
    'data': [
        # ===== SECURITY =====
        'security/ir.model.access.csv',
        'security/fusion_accounting_security.xml',
        'security/accounting_security.xml',
        'security/fusion_account_asset_security.xml',

        # ===== BASE DATA =====
        'data/fusion_accounting_data.xml',
        'data/mail_activity_type_data.xml',
        'data/mail_templates.xml',
        'data/fusion_accounting_tour.xml',

        # ===== REPORT VIEWS (needed before report actions reference them) =====
        'views/account_report_view.xml',

        # ===== REPORT DEFINITIONS =====
        'data/balance_sheet.xml',
        'data/profit_and_loss.xml',
        'data/cash_flow_report.xml',
        'data/executive_summary.xml',
        'data/general_ledger.xml',
        'data/trial_balance.xml',
        'data/partner_ledger.xml',
        'data/aged_partner_balance.xml',
        'data/generic_tax_report.xml',
        'data/journal_report.xml',
        'data/sales_report.xml',
        'data/multicurrency_revaluation_report.xml',
        'data/bank_reconciliation_report.xml',
        'data/deferred_reports.xml',
        'data/assets_reports.xml',

        # ===== REPORT ACTIONS (reference reports + views) =====
        'data/account_report_actions.xml',
        'data/account_report_actions_depr.xml',

        # ===== DATA-LEVEL MENUS (reference actions above) =====
        'data/menuitems.xml',
        'data/menuitems_asset.xml',

        # ===== OTHER DATA =====
        'data/pdf_export_templates.xml',
        'data/ir_cron.xml',
        'data/report_send_cron.xml',
        'data/digest_data.xml',
        'data/followup_data.xml',
        'data/loan_data.xml',

        # ===== WIZARD ACTIONS (referenced by views below) =====
        'wizard/followup_send_wizard.xml',

        # ===== VIEWS =====
        'views/account_account_views.xml',
        'views/account_asset_views.xml',
        'views/account_asset_group_views.xml',
        'views/account_move_views.xml',
        'views/account_payment_views.xml',
        'views/account_tax_views.xml',
        'views/account_reconcile_views.xml',
        # 'views/account_reconcile_model_views.xml',  # V19: parent view restructured auto_reconcile
        'views/account_fiscal_year_view.xml',
        'views/account_journal_dashboard_views.xml',
        'views/bank_rec_widget_views.xml',
        'views/batch_payment_views.xml',
        'views/account_bank_statement_import_view.xml',
        'views/account_activity.xml',
        'views/mail_activity_views.xml',
        'views/res_config_settings_views.xml',
        'views/res_company_views.xml',
        'views/res_partner_views.xml',
        'views/partner_views.xml',
        # 'views/product_views.xml',  # V19: parent view structure changed
        'views/report_invoice.xml',
        'views/report_template.xml',
        'views/digest_views.xml',
        'views/followup_views.xml',
        'views/loan_views.xml',
        'views/document_extraction_views.xml',
        'views/edi_views.xml',
        'views/external_tax_views.xml',
        'views/fiscal_compliance_views.xml',
        # 'views/integration_bridge_views.xml',  # V19: requires fleet module
        'views/additional_features_views.xml',
        # 'views/tax_python_views.xml',  # V19: parent view xpath changed
        # Menuitems that reference view-defined actions (MUST come after those views)
        'views/fusion_accounting_menuitems.xml',

        # ===== WIZARDS =====
        'wizard/account_change_lock_date.xml',
        'wizard/account_reconcile_wizard.xml',
        'wizard/account_auto_reconcile_wizard.xml',
        'wizard/account_report_file_download_error_wizard.xml',
        'wizard/account_report_send.xml',
        'wizard/report_export_wizard.xml',
        'wizard/fiscal_year.xml',
        'wizard/multicurrency_revaluation.xml',
        'wizard/asset_modify_views.xml',
        'wizard/reconcile_model_wizard.xml',
        'wizard/bank_statement_import_wizard.xml',
        'wizard/account_transfer_wizard.xml',
        'wizard/extraction_review_wizard.xml',
        'wizard/loan_import_wizard.xml',
        'wizard/mail_activity_schedule_views.xml',
    ],
    'demo': [],
    'installable': True,
    'application': True,
    'post_init_hook': '_fusion_accounting_post_init',
    'uninstall_hook': 'uninstall_hook',
    'license': 'OPL-1',
    'assets': {
        'web.assets_backend': [
            'fusion_accounting/static/src/js/tours/fusion_accounting.js',
            'fusion_accounting/static/src/components/**/*',
            'fusion_accounting/static/src/**/*.xml',
            'fusion_accounting/static/src/js/**/*',
            'fusion_accounting/static/src/widgets/**/*',
            'fusion_accounting/static/src/**/*',
        ],
        'web.assets_unit_tests': [
            'fusion_accounting/static/tests/**/*',
            ('remove', 'fusion_accounting/static/tests/tours/**/*'),
            'fusion_accounting/static/tests/*.js',
            'fusion_accounting/static/tests/account_report/**/*.js',
        ],
        'web.assets_tests': [
            'fusion_accounting/static/tests/tours/**/*',
        ],
        'fusion_accounting.assets_pdf_export': [
            ('include', 'web._assets_helpers'),
            'web/static/src/scss/pre_variables.scss',
            'web/static/lib/bootstrap/scss/_variables.scss',
            'web/static/lib/bootstrap/scss/_variables-dark.scss',
            'web/static/lib/bootstrap/scss/_maps.scss',
            ('include', 'web._assets_bootstrap_backend'),
            'web/static/fonts/fonts.scss',
            'fusion_accounting/static/src/scss/**/*',
        ],
        'web.report_assets_common': [
            'fusion_accounting/static/src/scss/account_pdf_export_template.scss',
        ],
        'web.assets_web_dark': [
            'fusion_accounting/static/src/scss/*.dark.scss',
        ],
    },
    'images': ['static/description/banner.png'],
}
Fusion Accounting/__pycache__/__init__.cpython-310.pyc (new binary file, not shown)

Fusion Accounting/controllers/__init__.py (new file)
@@ -0,0 +1 @@
from . import main

Fusion Accounting/controllers/__pycache__/main.cpython-310.pyc (new binary file, not shown)

Fusion Accounting/controllers/main.py (new file)
@@ -0,0 +1,129 @@
# Fusion Accounting - HTTP Controllers
# Provides web endpoints for report file generation and attachment downloads

import json

from werkzeug.exceptions import InternalServerError

from odoo import http
from odoo.addons.fusion_accounting.models.account_report import AccountReportFileDownloadException
from odoo.addons.account.controllers.download_docs import _get_headers
from odoo.http import content_disposition, request
from odoo.models import check_method_name
from odoo.tools.misc import html_escape


class AccountReportController(http.Controller):
    """Handles HTTP requests for generating and downloading
    accounting report files in various formats."""

    @http.route('/fusion_accounting', type='http', auth='user', methods=['POST'], csrf=False)
    def get_report(self, options, file_generator, **kwargs):
        """Generate a report file based on the provided options and generator method.

        :param options: JSON-encoded report configuration options
        :param file_generator: name of the method that produces the file
        :returns: HTTP response with the generated file content
        """
        current_uid = request.uid
        parsed_options = json.loads(options)

        # Determine which companies are in scope for this report
        company_ids = request.env['account.report'].get_report_company_ids(parsed_options)
        if not company_ids:
            cookie_cids = request.cookies.get('cids', str(request.env.user.company_id.id))
            company_ids = [int(cid) for cid in cookie_cids.split('-')]

        target_report = (
            request.env['account.report']
            .with_user(current_uid)
            .with_context(allowed_company_ids=company_ids)
            .browse(parsed_options['report_id'])
        )

        try:
            check_method_name(file_generator)
            file_data = target_report.dispatch_report_action(parsed_options, file_generator)

            raw_content = file_data['file_content']
            output_type = file_data['file_type']
            resp_headers = self._build_response_headers(
                output_type, file_data['file_name'], raw_content,
            )

            if output_type == 'xlsx':
                # Stream binary spreadsheet data
                http_response = request.make_response(None, headers=resp_headers)
                http_response.stream.write(raw_content)
            else:
                http_response = request.make_response(raw_content, headers=resp_headers)

            if output_type in ('zip', 'xaf'):
                # Enable streaming for large archive files to avoid
                # loading the entire content into memory at once
                http_response.direct_passthrough = True

            return http_response

        except AccountReportFileDownloadException as exc:
            if exc.content:
                exc.content['file_content'] = exc.content['file_content'].decode()
            error_payload = {
                'name': type(exc).__name__,
                'arguments': [exc.errors, exc.content],
            }
            raise InternalServerError(
                response=self._format_error_response(error_payload)
            ) from exc

        except Exception as exc:  # noqa: BLE001
            error_payload = http.serialize_exception(exc)
            raise InternalServerError(
                response=self._format_error_response(error_payload)
            ) from exc

    def _format_error_response(self, error_data):
        """Wrap error details into a JSON response matching the Odoo RPC error format."""
        envelope = {
            'code': 200,
            'message': 'Odoo Server Error',
            'data': error_data,
        }
        return request.make_response(html_escape(json.dumps(envelope)))

    def _build_response_headers(self, file_type, file_name, raw_content):
        """Construct HTTP response headers appropriate for the given file type."""
        mime_type = request.env['account.report'].get_export_mime_type(file_type)
        header_list = [
            ('Content-Type', mime_type),
            ('Content-Disposition', content_disposition(file_name)),
        ]

        # Include Content-Length for text-based formats
        if file_type in ('xml', 'txt', 'csv', 'kvr'):
            header_list.append(('Content-Length', len(raw_content)))

        return header_list

    @http.route(
        '/fusion_accounting/download_attachments/<models("ir.attachment"):attachments>',
        type='http',
        auth='user',
    )
    def download_report_attachments(self, attachments):
        """Download one or more report attachments, packaging them
        into a zip archive when multiple files are requested."""
        attachments.check_access('read')
        assert all(
            att.res_id and att.res_model == 'res.partner'
            for att in attachments
        ), "Attachments must be linked to res.partner records"

        if len(attachments) == 1:
            single = attachments
            resp_headers = _get_headers(single.name, single.mimetype, single.raw)
            return request.make_response(single.raw, resp_headers)
        else:
            zip_data = attachments._build_zip_from_attachments()
            resp_headers = _get_headers('attachments.zip', 'zip', zip_data)
            return request.make_response(zip_data, resp_headers)
Fusion Accounting/data/account_report_actions.xml (new file)
@@ -0,0 +1,160 @@
<?xml version="1.0" encoding="utf-8"?>
<odoo>
    <data>

        <record id="action_account_report_cs" model="ir.actions.client">
            <field name="name">Cash Flow Statement</field>
            <field name="tag">account_report</field>
            <field name="context" eval="{'report_id': ref('fusion_accounting.cash_flow_report')}"/>
        </record>

        <record id="action_account_report_bs" model="ir.actions.client">
            <field name="name">Balance Sheet</field>
            <field name="tag">account_report</field>
            <field name="path">balance-sheet</field>
            <field name="context" eval="{'report_id': ref('fusion_accounting.balance_sheet')}"/>
        </record>

        <record id="action_account_report_exec_summary" model="ir.actions.client">
            <field name="name">Executive Summary</field>
            <field name="tag">account_report</field>
            <field name="path">executive-summary</field>
            <field name="context" eval="{'report_id': ref('fusion_accounting.executive_summary')}"/>
        </record>

        <record id="action_account_report_pl" model="ir.actions.client">
            <field name="name">Profit and Loss</field>
            <field name="tag">account_report</field>
            <field name="path">profit-and-loss</field>
            <field name="context" eval="{'report_id': ref('fusion_accounting.profit_and_loss')}"/>
        </record>

        <record id="action_account_report_gt" model="ir.actions.client">
            <field name="name">Tax Return</field>
            <field name="tag">account_report</field>
            <field name="path">tax-report</field>
            <field name="context" eval="{'report_id': ref('account.generic_tax_report')}"/>
        </record>

        <record id="action_account_report_ja" model="ir.actions.client">
            <field name="name">Journal Audit</field>
            <field name="tag">account_report</field>
            <field name="path">journal-report</field>
            <field name="context" eval="{'report_id': ref('fusion_accounting.journal_report')}"/>
        </record>

        <record id="action_account_report_general_ledger" model="ir.actions.client">
            <field name="name">General Ledger</field>
            <field name="tag">account_report</field>
            <field name="path">general-ledger</field>
            <field name="context" eval="{'report_id': ref('fusion_accounting.general_ledger_report')}"/>
        </record>

        <record id="action_account_report_multicurrency_revaluation" model="ir.actions.client">
            <field name="name">Unrealized Currency Gains/Losses</field>
            <field name="tag">account_report</field>
            <field name="context" eval="{'report_id': ref('fusion_accounting.multicurrency_revaluation_report')}"/>
        </record>

        <record id="action_account_report_ar" model="ir.actions.client">
            <field name="name">Aged Receivable</field>
            <field name="tag">account_report</field>
            <field name="path">aged-receivable</field>
            <field name="context" eval="{'report_id': ref('fusion_accounting.aged_receivable_report')}"/>
        </record>

        <record id="action_account_report_ap" model="ir.actions.client">
            <field name="name">Aged Payable</field>
            <field name="tag">account_report</field>
            <field name="path">aged-payable</field>
            <field name="context" eval="{'report_id': ref('fusion_accounting.aged_payable_report')}"/>
|
||||||
|
</record>
|
||||||
|
|
||||||
|
<record id="action_account_report_coa" model="ir.actions.client">
|
||||||
|
<field name="name">Trial Balance</field>
|
||||||
|
<field name="tag">account_report</field>
|
||||||
|
<field name="path">trial-balance</field>
|
||||||
|
<field name="context" eval="{'report_id': ref('fusion_accounting.trial_balance_report')}"/>
|
||||||
|
</record>
|
||||||
|
|
||||||
|
<record id="action_account_report_partner_ledger" model="ir.actions.client">
|
||||||
|
<field name="name">Partner Ledger</field>
|
||||||
|
<field name="tag">account_report</field>
|
||||||
|
<field name="path">partner-ledger</field>
|
||||||
|
<field name="context" eval="{'report_id': ref('fusion_accounting.partner_ledger_report')}"/>
|
||||||
|
</record>
|
||||||
|
|
||||||
|
<record id="action_account_report_sales" model="ir.actions.client">
|
||||||
|
<field name="name">EC Sales List</field>
|
||||||
|
<field name="tag">account_report</field>
|
||||||
|
<field name="context" eval="{'report_id': ref('fusion_accounting.generic_ec_sales_report')}"/>
|
||||||
|
</record>
|
||||||
|
|
||||||
|
<record id="action_account_report_deferred_expense" model="ir.actions.client">
|
||||||
|
<field name="name">Deferred Expense</field>
|
||||||
|
<field name="tag">account_report</field>
|
||||||
|
<field name="path">deferred-expense</field>
|
||||||
|
<field name="context" eval="{'report_id': ref('fusion_accounting.deferred_expense_report')}"/>
|
||||||
|
</record>
|
||||||
|
<record id="action_account_report_deferred_revenue" model="ir.actions.client">
|
||||||
|
<field name="name">Deferred Revenue</field>
|
||||||
|
<field name="tag">account_report</field>
|
||||||
|
<field name="path">deferred-revenue</field>
|
||||||
|
<field name="context" eval="{'report_id': ref('fusion_accounting.deferred_revenue_report')}"/>
|
||||||
|
</record>
|
||||||
|
|
||||||
|
<record id="account_financial_current_year_earnings0" model="account.report.line">
|
||||||
|
<field name="action_id" ref="action_account_report_pl"/>
|
||||||
|
</record>
|
||||||
|
|
||||||
|
<record id="account_financial_report_executivesummary_profitability0" model="account.report.line">
|
||||||
|
<field name="action_id" ref="action_account_report_pl"/>
|
||||||
|
</record>
|
||||||
|
|
||||||
|
<record id="account_financial_report_executivesummary_balancesheet0" model="account.report.line">
|
||||||
|
<field name="action_id" ref="action_account_report_bs"/>
|
||||||
|
</record>
|
||||||
|
|
||||||
|
<record id="action_create_report_menu" model="ir.actions.server">
|
||||||
|
<field name="name">Create Menu Item</field>
|
||||||
|
<field name="model_id" ref="account.model_account_report"/>
|
||||||
|
<field name="binding_model_id" ref="account.model_account_report"/>
|
||||||
|
<field name="state">code</field>
|
||||||
|
<field name="binding_view_types">form</field>
|
||||||
|
<field name="code">
|
||||||
|
if records:
|
||||||
|
action = records._create_menu_item_for_report()
|
||||||
|
</field>
|
||||||
|
</record>
|
||||||
|
|
||||||
|
<record id="action_account_report_tree" model="ir.actions.act_window">
|
||||||
|
<field name="name">Accounting Reports</field>
|
||||||
|
<field name="res_model">account.report</field>
|
||||||
|
<field name="view_mode">list,form</field>
|
||||||
|
<field name="view_id" ref="account_report_tree"/>
|
||||||
|
<field name="search_view_id" ref="view_account_report_search"/>
|
||||||
|
</record>
|
||||||
|
|
||||||
|
<record id="action_account_report_horizontal_groups" model="ir.actions.act_window">
|
||||||
|
<field name="name">Horizontal Groups</field>
|
||||||
|
<field name="res_model">account.report.horizontal.group</field>
|
||||||
|
<field name="view_mode">list,form</field>
|
||||||
|
<field name="view_id" ref="account_report_horizontal_group_tree"/>
|
||||||
|
</record>
|
||||||
|
|
||||||
|
<record id="action_account_report_bank_reconciliation" model="ir.actions.client">
|
||||||
|
<field name="name">Bank Reconciliation</field>
|
||||||
|
<field name="tag">account_report</field>
|
||||||
|
<field name="context" eval="{'report_id': ref('fusion_accounting.bank_reconciliation_report')}"/>
|
||||||
|
</record>
|
||||||
|
|
||||||
|
<record id="action_account_report_budget_tree" model="ir.actions.act_window">
|
||||||
|
<field name="name">Financial Budgets</field>
|
||||||
|
<field name="res_model">account.report.budget</field>
|
||||||
|
<field name="view_mode">list,form</field>
|
||||||
|
<field name="view_id" ref="account_report_budget_tree"/>
|
||||||
|
</record>
|
||||||
|
|
||||||
|
|
||||||
|
</data>
|
||||||
|
</odoo>
|
||||||
13
Fusion Accounting/data/account_report_actions_depr.xml
Normal file
@@ -0,0 +1,13 @@
<?xml version="1.0" encoding="utf-8"?>
<odoo>
    <record id="action_account_report_assets" model="ir.actions.client">
        <field name="name">Depreciation Schedule</field>
        <field name="tag">account_report</field>
        <field name="context" eval="{'report_id': ref('fusion_accounting.assets_report')}"/>
    </record>
    <menuitem id="menu_action_account_report_assets"
              name="Depreciation Schedule"
              action="action_account_report_assets"
              parent="account.account_reports_management_menu"
              groups="account.group_account_readonly"/>
</odoo>
326
Fusion Accounting/data/aged_partner_balance.xml
Normal file
@@ -0,0 +1,326 @@
<?xml version="1.0" encoding="utf-8"?>
<odoo>
    <record id="aged_receivable_report" model="account.report">
        <field name="name">Aged Receivable</field>
        <field name="filter_date_range" eval="False"/>
        <field name="filter_unfold_all" eval="True"/>
        <field name="filter_partner" eval="True"/>
        <field name="filter_period_comparison" eval="False"/>
        <field name="filter_account_type">receivable</field>
        <field name="filter_hierarchy">never</field>
        <field name="filter_show_draft" eval="False"/>
        <field name="filter_multi_company">selector</field>
        <field name="default_opening_date_filter">today</field>
        <field name="custom_handler_model_id" ref="model_account_aged_receivable_report_handler"/>
        <field name="column_ids">
            <record id="aged_receivable_report_invoice_date" model="account.report.column">
                <field name="name">Invoice Date</field>
                <field name="expression_label">invoice_date</field>
                <field name="figure_type">date</field>
                <field name="sortable" eval="True"/>
            </record>
            <record id="aged_receivable_report_amount_currency" model="account.report.column">
                <field name="name">Amount Currency</field>
                <field name="expression_label">amount_currency</field>
            </record>
            <record id="aged_receivable_report_currency" model="account.report.column">
                <field name="name">Currency</field>
                <field name="expression_label">currency</field>
                <field name="figure_type">string</field>
            </record>
            <record id="aged_receivable_report_account_name" model="account.report.column">
                <field name="name">Account</field>
                <field name="expression_label">account_name</field>
                <field name="figure_type">string</field>
            </record>
            <record id="aged_receivable_report_period0" model="account.report.column">
                <field name="name">At Date</field>
                <field name="expression_label">period0</field>
                <field name="sortable" eval="True"/>
            </record>
            <record id="aged_receivable_report_period1" model="account.report.column">
                <field name="name">Period 1</field>
                <field name="expression_label">period1</field>
                <field name="sortable" eval="True"/>
            </record>
            <record id="aged_receivable_report_period2" model="account.report.column">
                <field name="name">Period 2</field>
                <field name="expression_label">period2</field>
                <field name="sortable" eval="True"/>
            </record>
            <record id="aged_receivable_report_period3" model="account.report.column">
                <field name="name">Period 3</field>
                <field name="expression_label">period3</field>
                <field name="sortable" eval="True"/>
            </record>
            <record id="aged_receivable_report_period4" model="account.report.column">
                <field name="name">Period 4</field>
                <field name="expression_label">period4</field>
                <field name="sortable" eval="True"/>
            </record>
            <record id="aged_receivable_report_period5" model="account.report.column">
                <field name="name">Older</field>
                <field name="expression_label">period5</field>
                <field name="sortable" eval="True"/>
            </record>
            <record id="aged_receivable_report_total" model="account.report.column">
                <field name="name">Total</field>
                <field name="expression_label">total</field>
                <field name="sortable" eval="True"/>
            </record>
        </field>
        <field name="line_ids">
            <record id="aged_receivable_line" model="account.report.line">
                <field name="name">Aged Receivable</field>
                <field name="groupby">partner_id, id</field>
                <field name="expression_ids">
                    <record id="aged_receivable_line_invoice_date" model="account.report.expression">
                        <field name="label">invoice_date</field>
                        <field name="engine">custom</field>
                        <field name="formula">_report_custom_engine_aged_receivable</field>
                        <field name="subformula">invoice_date</field>
                        <field name="auditable" eval="False"/>
                    </record>
                    <record id="aged_receivable_line_amount_currency" model="account.report.expression">
                        <field name="label">amount_currency</field>
                        <field name="engine">custom</field>
                        <field name="formula">_report_custom_engine_aged_receivable</field>
                        <field name="subformula">amount_currency</field>
                        <field name="auditable" eval="False"/>
                    </record>
                    <record id="aged_receivable_line_amount_currency_forced_currency" model="account.report.expression">
                        <field name="label">_currency_amount_currency</field>
                        <field name="engine">custom</field>
                        <field name="formula">_report_custom_engine_aged_receivable</field>
                        <field name="subformula">currency_id</field>
                    </record>
                    <record id="aged_receivable_line_currency" model="account.report.expression">
                        <field name="label">currency</field>
                        <field name="engine">custom</field>
                        <field name="formula">_report_custom_engine_aged_receivable</field>
                        <field name="subformula">currency</field>
                        <field name="auditable" eval="False"/>
                    </record>
                    <record id="aged_receivable_line_account_name" model="account.report.expression">
                        <field name="label">account_name</field>
                        <field name="engine">custom</field>
                        <field name="formula">_report_custom_engine_aged_receivable</field>
                        <field name="subformula">account_name</field>
                        <field name="auditable" eval="False"/>
                    </record>
                    <record id="aged_receivable_line_period0" model="account.report.expression">
                        <field name="label">period0</field>
                        <field name="engine">custom</field>
                        <field name="formula">_report_custom_engine_aged_receivable</field>
                        <field name="subformula">period0</field>
                        <field name="auditable" eval="True"/>
                    </record>
                    <record id="aged_receivable_line_period1" model="account.report.expression">
                        <field name="label">period1</field>
                        <field name="engine">custom</field>
                        <field name="formula">_report_custom_engine_aged_receivable</field>
                        <field name="subformula">period1</field>
                        <field name="auditable" eval="True"/>
                    </record>
                    <record id="aged_receivable_line_period2" model="account.report.expression">
                        <field name="label">period2</field>
                        <field name="engine">custom</field>
                        <field name="formula">_report_custom_engine_aged_receivable</field>
                        <field name="subformula">period2</field>
                        <field name="auditable" eval="True"/>
                    </record>
                    <record id="aged_receivable_line_period3" model="account.report.expression">
                        <field name="label">period3</field>
                        <field name="engine">custom</field>
                        <field name="formula">_report_custom_engine_aged_receivable</field>
                        <field name="subformula">period3</field>
                        <field name="auditable" eval="True"/>
                    </record>
                    <record id="aged_receivable_line_period4" model="account.report.expression">
                        <field name="label">period4</field>
                        <field name="engine">custom</field>
                        <field name="formula">_report_custom_engine_aged_receivable</field>
                        <field name="subformula">period4</field>
                        <field name="auditable" eval="True"/>
                    </record>
                    <record id="aged_receivable_line_period5" model="account.report.expression">
                        <field name="label">period5</field>
                        <field name="engine">custom</field>
                        <field name="formula">_report_custom_engine_aged_receivable</field>
                        <field name="subformula">period5</field>
                        <field name="auditable" eval="True"/>
                    </record>
                    <record id="aged_receivable_line_total" model="account.report.expression">
                        <field name="label">total</field>
                        <field name="engine">custom</field>
                        <field name="formula">_report_custom_engine_aged_receivable</field>
                        <field name="subformula">total</field>
                        <field name="auditable" eval="True"/>
                    </record>
                </field>
            </record>
        </field>
    </record>

    <record id="aged_payable_report" model="account.report">
        <field name="name">Aged Payable</field>
        <field name="filter_date_range" eval="False"/>
        <field name="filter_unfold_all" eval="True"/>
        <field name="filter_partner" eval="True"/>
        <field name="filter_period_comparison" eval="False"/>
        <field name="filter_account_type">payable</field>
        <field name="filter_hierarchy">never</field>
        <field name="filter_show_draft" eval="False"/>
        <field name="filter_multi_company">selector</field>
        <field name="default_opening_date_filter">today</field>
        <field name="custom_handler_model_id" ref="model_account_aged_payable_report_handler"/>
        <field name="column_ids">
            <record id="aged_payable_report_invoice_date" model="account.report.column">
                <field name="name">Invoice Date</field>
                <field name="expression_label">invoice_date</field>
                <field name="figure_type">date</field>
                <field name="sortable" eval="True"/>
            </record>
            <record id="aged_payable_report_amount_currency" model="account.report.column">
                <field name="name">Amount Currency</field>
                <field name="expression_label">amount_currency</field>
            </record>
            <record id="aged_payable_report_currency" model="account.report.column">
                <field name="name">Currency</field>
                <field name="expression_label">currency</field>
                <field name="figure_type">string</field>
            </record>
            <record id="aged_payable_report_account_name" model="account.report.column">
                <field name="name">Account</field>
                <field name="expression_label">account_name</field>
                <field name="figure_type">string</field>
            </record>
            <record id="aged_payable_report_period0" model="account.report.column">
                <field name="name">At Date</field>
                <field name="expression_label">period0</field>
                <field name="sortable" eval="True"/>
            </record>
            <record id="aged_payable_report_period1" model="account.report.column">
                <field name="name">Period 1</field>
                <field name="expression_label">period1</field>
                <field name="sortable" eval="True"/>
            </record>
            <record id="aged_payable_report_period2" model="account.report.column">
                <field name="name">Period 2</field>
                <field name="expression_label">period2</field>
                <field name="sortable" eval="True"/>
            </record>
            <record id="aged_payable_report_period3" model="account.report.column">
                <field name="name">Period 3</field>
                <field name="expression_label">period3</field>
                <field name="sortable" eval="True"/>
            </record>
            <record id="aged_payable_report_period4" model="account.report.column">
                <field name="name">Period 4</field>
                <field name="expression_label">period4</field>
                <field name="sortable" eval="True"/>
            </record>
            <record id="aged_payable_report_period5" model="account.report.column">
                <field name="name">Older</field>
                <field name="expression_label">period5</field>
                <field name="sortable" eval="True"/>
            </record>
            <record id="aged_payable_report_total" model="account.report.column">
                <field name="name">Total</field>
                <field name="expression_label">total</field>
                <field name="sortable" eval="True"/>
            </record>
        </field>
        <field name="line_ids">
            <record id="aged_payable_line" model="account.report.line">
                <field name="name">Aged Payable</field>
                <field name="groupby">partner_id, id</field>
                <field name="expression_ids">
                    <record id="aged_payable_line_invoice_date" model="account.report.expression">
                        <field name="label">invoice_date</field>
                        <field name="engine">custom</field>
                        <field name="formula">_report_custom_engine_aged_payable</field>
                        <field name="subformula">invoice_date</field>
                        <field name="auditable" eval="False"/>
                    </record>
                    <record id="aged_payable_line_amount_currency" model="account.report.expression">
                        <field name="label">amount_currency</field>
                        <field name="engine">custom</field>
                        <field name="formula">_report_custom_engine_aged_payable</field>
                        <field name="subformula">amount_currency</field>
                        <field name="auditable" eval="False"/>
                    </record>
                    <record id="aged_payable_line_amount_currency_forced_currency" model="account.report.expression">
                        <field name="label">_currency_amount_currency</field>
                        <field name="engine">custom</field>
                        <field name="formula">_report_custom_engine_aged_payable</field>
                        <field name="subformula">currency_id</field>
                    </record>
                    <record id="aged_payable_line_currency" model="account.report.expression">
                        <field name="label">currency</field>
                        <field name="engine">custom</field>
                        <field name="formula">_report_custom_engine_aged_payable</field>
                        <field name="subformula">currency</field>
                        <field name="auditable" eval="False"/>
                    </record>
                    <record id="aged_payable_line_account_name" model="account.report.expression">
                        <field name="label">account_name</field>
                        <field name="engine">custom</field>
                        <field name="formula">_report_custom_engine_aged_payable</field>
                        <field name="subformula">account_name</field>
                        <field name="auditable" eval="False"/>
                    </record>
                    <record id="aged_payable_line_period0" model="account.report.expression">
                        <field name="label">period0</field>
                        <field name="engine">custom</field>
                        <field name="formula">_report_custom_engine_aged_payable</field>
                        <field name="subformula">period0</field>
                        <field name="auditable" eval="True"/>
                    </record>
                    <record id="aged_payable_line_period1" model="account.report.expression">
                        <field name="label">period1</field>
                        <field name="engine">custom</field>
                        <field name="formula">_report_custom_engine_aged_payable</field>
                        <field name="subformula">period1</field>
                        <field name="auditable" eval="True"/>
                    </record>
                    <record id="aged_payable_line_period2" model="account.report.expression">
                        <field name="label">period2</field>
                        <field name="engine">custom</field>
                        <field name="formula">_report_custom_engine_aged_payable</field>
                        <field name="subformula">period2</field>
                        <field name="auditable" eval="True"/>
                    </record>
                    <record id="aged_payable_line_period3" model="account.report.expression">
                        <field name="label">period3</field>
                        <field name="engine">custom</field>
                        <field name="formula">_report_custom_engine_aged_payable</field>
                        <field name="subformula">period3</field>
                        <field name="auditable" eval="True"/>
                    </record>
                    <record id="aged_payable_line_period4" model="account.report.expression">
                        <field name="label">period4</field>
                        <field name="engine">custom</field>
                        <field name="formula">_report_custom_engine_aged_payable</field>
                        <field name="subformula">period4</field>
                        <field name="auditable" eval="True"/>
                    </record>
                    <record id="aged_payable_line_period5" model="account.report.expression">
                        <field name="label">period5</field>
                        <field name="engine">custom</field>
                        <field name="formula">_report_custom_engine_aged_payable</field>
                        <field name="subformula">period5</field>
                        <field name="auditable" eval="True"/>
                    </record>
                    <record id="aged_payable_line_total" model="account.report.expression">
                        <field name="label">total</field>
                        <field name="engine">custom</field>
                        <field name="formula">_report_custom_engine_aged_payable</field>
                        <field name="subformula">total</field>
                        <field name="auditable" eval="True"/>
                    </record>
                </field>
            </record>
        </field>
    </record>
</odoo>
70
Fusion Accounting/data/assets_reports.xml
Normal file
@@ -0,0 +1,70 @@
<?xml version="1.0" encoding="utf-8"?>
<odoo>
    <record id="assets_report" model="account.report">
        <field name="name">Depreciation Schedule</field>
        <field name="filter_hierarchy">optional</field>
        <field name="filter_unfold_all" eval="True"/>
        <field name="filter_period_comparison" eval="False"/>
        <field name="filter_journals" eval="True"/>
        <field name="custom_handler_model_id" ref="model_account_asset_report_handler"/>
        <field name="load_more_limit" eval="80"/>
        <field name="column_ids">
            <record id="assets_report_acquisition_date" model="account.report.column">
                <field name="name">Acquisition Date</field>
                <field name="expression_label">acquisition_date</field>
                <field name="figure_type">date</field>
            </record>
            <record id="assets_report_first_depreciation" model="account.report.column">
                <field name="name">First Depreciation</field>
                <field name="expression_label">first_depreciation</field>
                <field name="figure_type">date</field>
            </record>
            <record id="assets_report_first_method" model="account.report.column">
                <field name="name">Method</field>
                <field name="expression_label">method</field>
                <field name="figure_type">string</field>
            </record>
            <record id="assets_report_duration_rate" model="account.report.column">
                <field name="name">Duration / Rate</field>
                <field name="expression_label">duration_rate</field>
                <field name="figure_type">string</field>
            </record>
            <record id="assets_report_date_from" model="account.report.column">
                <field name="name">date from</field>
                <field name="expression_label">assets_date_from</field>
            </record>
            <record id="assets_report_assets_plus" model="account.report.column">
                <field name="name">+</field>
                <field name="expression_label">assets_plus</field>
            </record>
            <record id="assets_report_assets_minus" model="account.report.column">
                <field name="name">-</field>
                <field name="expression_label">assets_minus</field>
            </record>
            <record id="assets_report_assets_date_to" model="account.report.column">
                <field name="name">date to</field>
                <field name="expression_label">assets_date_to</field>
            </record>
            <record id="assets_report_depre_date_from" model="account.report.column">
                <field name="name">date from</field>
                <field name="expression_label">depre_date_from</field>
            </record>
            <record id="assets_report_depre_plus" model="account.report.column">
                <field name="name">+</field>
                <field name="expression_label">depre_plus</field>
            </record>
            <record id="assets_report_depre_minus" model="account.report.column">
                <field name="name">-</field>
                <field name="expression_label">depre_minus</field>
            </record>
            <record id="assets_report_depre_date_to" model="account.report.column">
                <field name="name">date to</field>
                <field name="expression_label">depre_date_to</field>
            </record>
            <record id="assets_report_balance" model="account.report.column">
                <field name="name">book_value</field>
                <field name="expression_label">balance</field>
            </record>
        </field>
    </record>
</odoo>
285
Fusion Accounting/data/balance_sheet.xml
Normal file
@@ -0,0 +1,285 @@
<?xml version="1.0" encoding="utf-8"?>
<odoo>
    <record id="balance_sheet" model="account.report">
        <field name="name">Balance Sheet</field>
        <field name="filter_date_range" eval="False"/>
        <field name="filter_analytic_groupby" eval="True"/>
        <field name="filter_unfold_all" eval="True"/>
        <field name="filter_journals" eval="True"/>
        <field name="filter_multi_company">selector</field>
        <field name="default_opening_date_filter">today</field>
        <field name="custom_handler_model_id" ref="model_account_balance_sheet_report_handler"/>
        <field name="column_ids">
            <record id="balance_sheet_balance" model="account.report.column">
                <field name="name">Balance</field>
                <field name="expression_label">balance</field>
            </record>
        </field>
        <field name="line_ids">
            <record id="account_financial_report_total_assets0" model="account.report.line">
                <field name="name">ASSETS</field>
                <field name="hierarchy_level">0</field>
                <field name="code">TA</field>
                <field name="horizontal_split_side">left</field>
                <field name="aggregation_formula">CA.balance + FA.balance + PNCA.balance</field>
                <field name="children_ids">
                    <record id="account_financial_report_current_assets_view0" model="account.report.line">
                        <field name="name">Current Assets</field>
                        <field name="code">CA</field>
                        <field name="aggregation_formula">BA.balance + REC.balance + CAS.balance + PRE.balance</field>
                        <field name="children_ids">
                            <record id="account_financial_report_bank_view0" model="account.report.line">
                                <field name="name">Bank and Cash Accounts</field>
                                <field name="code">BA</field>
                                <field name="groupby">account_id</field>
                                <field name="foldable" eval="True"/>
                                <field name="domain_formula">sum([('account_id.account_type', '=', 'asset_cash')])</field>
                            </record>
                            <record id="account_financial_report_receivable0" model="account.report.line">
                                <field name="name">Receivables</field>
                                <field name="code">REC</field>
                                <field name="groupby">account_id</field>
                                <field name="foldable" eval="True"/>
                                <field name="domain_formula">sum([('account_id.account_type', '=', 'asset_receivable'), ('account_id.non_trade', '=', False)])</field>
                            </record>
                            <record id="account_financial_report_current_assets0" model="account.report.line">
                                <field name="name">Current Assets</field>
                                <field name="code">CAS</field>
                                <field name="groupby">account_id</field>
                                <field name="foldable" eval="True"/>
                                <field name="domain_formula">sum(['|', ('account_id.account_type', '=', 'asset_current'), '&amp;', ('account_id.account_type', '=', 'asset_receivable'), ('account_id.non_trade', '=', True)])</field>
                            </record>
                            <record id="account_financial_report_prepayements0" model="account.report.line">
                                <field name="name">Prepayments</field>
                                <field name="code">PRE</field>
                                <field name="groupby">account_id</field>
                                <field name="foldable" eval="True"/>
                                <field name="domain_formula">sum([('account_id.account_type', '=', 'asset_prepayments')])</field>
                            </record>
                        </field>
                    </record>
                    <record id="account_financial_report_fixed_assets_view0" model="account.report.line">
                        <field name="name">Plus Fixed Assets</field>
                        <field name="code">FA</field>
                        <field name="groupby">account_id</field>
                        <field name="foldable" eval="True"/>
                        <field name="domain_formula">sum([('account_id.account_type', '=', 'asset_fixed')])</field>
                    </record>
                    <record id="account_financial_report_non_current_assets_view0" model="account.report.line">
                        <field name="name">Plus Non-current Assets</field>
                        <field name="code">PNCA</field>
                        <field name="groupby">account_id</field>
                        <field name="foldable" eval="True"/>
                        <field name="domain_formula">sum([('account_id.account_type', '=', 'asset_non_current')])</field>
                    </record>
                </field>
            </record>
            <record id="account_financial_report_liabilities_view0" model="account.report.line">
                <field name="name">LIABILITIES</field>
                <field name="hierarchy_level">0</field>
                <field name="code">L</field>
                <field name="horizontal_split_side">right</field>
                <field name="expression_ids">
                    <record id="account_financial_report_liabilities_view0_balance" model="account.report.expression">
                        <field name="label">balance</field>
                        <field name="engine">aggregation</field>
                        <field name="formula">CL.balance + NL.balance</field>
                        <field name="green_on_positive" eval="False"/>
|
||||||
|
</record>
|
||||||
|
</field>
|
||||||
|
<field name="children_ids">
|
||||||
|
<record id="account_financial_report_current_liabilities0" model="account.report.line">
|
||||||
|
<field name="name">Current Liabilities</field>
|
||||||
|
<field name="code">CL</field>
|
||||||
|
<field name="expression_ids">
|
||||||
|
<record id="account_financial_report_current_liabilities0_balance" model="account.report.expression">
|
||||||
|
<field name="label">balance</field>
|
||||||
|
<field name="engine">aggregation</field>
|
||||||
|
<field name="formula">CL1.balance + CL2.balance</field>
|
||||||
|
<field name="green_on_positive" eval="False"/>
|
||||||
|
</record>
|
||||||
|
</field>
|
||||||
|
<field name="children_ids">
|
||||||
|
<record id="account_financial_report_current_liabilities1" model="account.report.line">
|
||||||
|
<field name="name">Current Liabilities</field>
|
||||||
|
<field name="code">CL1</field>
|
||||||
|
<field name="groupby">account_id</field>
|
||||||
|
<field name="foldable" eval="True"/>
|
||||||
|
<field name="expression_ids">
|
||||||
|
<record id="account_financial_report_current_liabilities1_balance" model="account.report.expression">
|
||||||
|
<field name="label">balance</field>
|
||||||
|
<field name="engine">domain</field>
|
||||||
|
<field name="formula" eval="['|', ('account_id.account_type', 'in', ('liability_current', 'liability_credit_card')), '&', ('account_id.account_type', '=', 'liability_payable'), ('account_id.non_trade', '=', True)]"/>
|
||||||
|
<field name="subformula">-sum</field>
|
||||||
|
<field name="green_on_positive" eval="False"/>
|
||||||
|
</record>
|
||||||
|
</field>
|
||||||
|
</record>
|
||||||
|
<record id="account_financial_report_current_liabilities_payable" model="account.report.line">
|
||||||
|
<field name="name">Payables</field>
|
||||||
|
<field name="code">CL2</field>
|
||||||
|
<field name="groupby">account_id</field>
|
||||||
|
<field name="foldable" eval="True"/>
|
||||||
|
<field name="expression_ids">
|
||||||
|
<record id="account_financial_report_current_liabilities_payable_balance" model="account.report.expression">
|
||||||
|
<field name="label">balance</field>
|
||||||
|
<field name="engine">domain</field>
|
||||||
|
<field name="formula" eval="[('account_id.account_type', '=', 'liability_payable'), ('account_id.non_trade', '=', False)]"/>
|
||||||
|
<field name="subformula">-sum</field>
|
||||||
|
<field name="green_on_positive" eval="False"/>
|
||||||
|
</record>
|
||||||
|
</field>
|
||||||
|
</record>
|
||||||
|
</field>
|
||||||
|
</record>
|
||||||
|
<record id="account_financial_report_non_current_liabilities0" model="account.report.line">
<field name="name">Plus Non-current Liabilities</field>
<field name="code">NL</field>
<field name="groupby">account_id</field>
<field name="foldable" eval="True"/>
<field name="expression_ids">
<record id="account_financial_report_non_current_liabilities0_balance" model="account.report.expression">
<field name="label">balance</field>
<field name="engine">domain</field>
<field name="formula" eval="[('account_id.account_type', '=', 'liability_non_current')]"/>
<field name="subformula">-sum</field>
<field name="green_on_positive" eval="False"/>
</record>
</field>
</record>
</field>
</record>
<record id="account_financial_report_equity0" model="account.report.line">
<field name="name">EQUITY</field>
<field name="hierarchy_level">0</field>
<field name="code">EQ</field>
<field name="horizontal_split_side">right</field>
<field name="aggregation_formula">UNAFFECTED_EARNINGS.balance + RETAINED_EARNINGS.balance</field>
<field name="children_ids">
<record id="account_financial_unaffected_earnings0" model="account.report.line">
<field name="name">Unallocated Earnings</field>
<field name="code">UNAFFECTED_EARNINGS</field>
<field name="aggregation_formula">CURR_YEAR_EARNINGS.balance + PREV_YEAR_EARNINGS.balance</field>
<field name="children_ids">
<record id="account_financial_current_year_earnings0" model="account.report.line">
<field name="name">Current Year Unallocated Earnings</field>
<field name="code">CURR_YEAR_EARNINGS</field>
<field name="aggregation_formula"/>
<field name="expression_ids">
<record id="account_financial_current_year_earnings_pnl" model="account.report.expression">
<field name="label">pnl</field>
<field name="engine">aggregation</field>
<field name="formula">NEP.balance</field>
<field name="date_scope">from_fiscalyear</field>
<field name="subformula">cross_report</field>
</record>
<record id="account_financial_current_year_earnings_alloc" model="account.report.expression">
<field name="label">alloc</field>
<field name="engine">domain</field>
<field name="formula" eval="[('account_id.account_type', '=', 'equity_unaffected')]"/>
<field name="date_scope">from_fiscalyear</field>
<field name="subformula">-sum</field>
</record>
<record id="account_financial_current_year_earnings_balance" model="account.report.expression">
<field name="label">balance</field>
<field name="engine">aggregation</field>
<field name="formula">CURR_YEAR_EARNINGS.pnl + CURR_YEAR_EARNINGS.alloc</field>
</record>
</field>
</record>
<record id="account_financial_previous_year_earnings0" model="account.report.line">
<field name="name">Previous Years Unallocated Earnings</field>
<field name="code">PREV_YEAR_EARNINGS</field>
<field name="expression_ids">
<record id="account_financial_previous_year_earnings0_allocated_earnings" model="account.report.expression">
<field name="label">allocated_earnings</field>
<field name="engine">domain</field>
<field name="formula" eval="[('account_id.account_type', '=', 'equity_unaffected')]"/>
<field name="subformula">-sum</field>
<field name="date_scope">from_beginning</field>
</record>
<record id="account_financial_previous_year_earnings0_balance_domain" model="account.report.expression">
<field name="label">balance_domain</field>
<field name="engine">domain</field>
<field name="formula" eval="[('account_id.account_type', 'in', ('income', 'income_other', 'expense_direct_cost', 'expense', 'expense_depreciation'))]"/>
<field name="subformula">-sum</field>
<field name="date_scope">from_beginning</field>
</record>
<record id="account_financial_previous_year_earnings0_balance" model="account.report.expression">
<field name="label">balance</field>
<field name="engine">aggregation</field>
<field name="formula">PREV_YEAR_EARNINGS.balance_domain + PREV_YEAR_EARNINGS.allocated_earnings - CURR_YEAR_EARNINGS.balance</field>
</record>
</field>
</record>
</field>
</record>
<record id="account_financial_retained_earnings0" model="account.report.line">
<field name="name">Retained Earnings</field>
<field name="code">RETAINED_EARNINGS</field>
<field name="aggregation_formula">CURR_RETAINED_EARNINGS.balance + PREV_RETAINED_EARNINGS.balance</field>
<field name="groupby" eval="False"/>
<field name="foldable" eval="False"/>
<field name="children_ids">
<record id="account_financial_retained_earnings_line_1" model="account.report.line">
<field name="name">Current Year Retained Earnings</field>
<field name="code">CURR_RETAINED_EARNINGS</field>
<field name="groupby">account_id</field>
<field name="foldable" eval="True"/>
<field name="expression_ids">
<record id="account_financial_retained_earnings_current" model="account.report.expression">
<field name="label">balance</field>
<field name="engine">domain</field>
<field name="formula" eval="[('account_id.account_type', '=', 'equity')]"/>
<field name="subformula">-sum</field>
<field name="date_scope">from_fiscalyear</field>
</record>
</field>
</record>
<record id="account_financial_retained_earnings_line_2" model="account.report.line">
<field name="name">Previous Years Retained Earnings</field>
<field name="code">PREV_RETAINED_EARNINGS</field>
<field name="expression_ids">
<record id="account_financial_retained_earnings_total" model="account.report.expression">
<field name="label">total</field>
<field name="engine">domain</field>
<field name="formula" eval="[('account_id.account_type', '=', 'equity')]"/>
<field name="subformula">-sum</field>
</record>
<record id="account_financial_retained_earnings_balance" model="account.report.expression">
<field name="label">balance</field>
<field name="engine">aggregation</field>
<field name="formula">PREV_RETAINED_EARNINGS.total - CURR_RETAINED_EARNINGS.balance</field>
</record>
</field>
</record>
</field>
</record>
</field>
</record>
<record id="account_financial_report_liabilities_and_equity_view0" model="account.report.line">
<field name="name">LIABILITIES + EQUITY</field>
<field name="hierarchy_level">0</field>
<field name="code">LE</field>
<field name="horizontal_split_side">right</field>
<field name="expression_ids">
<record id="account_financial_report_liabilities_and_equity_view0_balance" model="account.report.expression">
<field name="label">balance</field>
<field name="engine">aggregation</field>
<field name="formula">L.balance + EQ.balance</field>
<field name="green_on_positive" eval="False"/>
</record>
</field>
</record>
<record id="account_financial_report_off_sheet" model="account.report.line">
<field name="name">OFF BALANCE SHEET ACCOUNTS</field>
<field name="hierarchy_level">0</field>
<field name="code">OS</field>
<field name="groupby">account_id</field>
<field name="foldable" eval="True"/>
<field name="hide_if_zero" eval="1"/>
<field name="domain_formula">-sum([('account_id.account_type', '=', 'off_balance')])</field>
</record>
</field>
</record>
</odoo>
474
Fusion Accounting/data/bank_reconciliation_report.xml
Normal file
@@ -0,0 +1,474 @@
<?xml version="1.0" encoding="utf-8"?>
<odoo>
<record id="bank_reconciliation_report" model="account.report">
<field name="name">Bank Reconciliation Report</field>
<field name="filter_show_draft" eval="True"/>
<field name="filter_date_range" eval="False"/>
<field name="filter_period_comparison" eval="False"/>
<field name="filter_hide_0_lines">by_default</field>
<field name="search_bar" eval="True"/>
<field name="default_opening_date_filter">today</field>
<field name="custom_handler_model_id" ref="model_account_bank_reconciliation_report_handler"/>
<field name="column_ids">
<record id="bank_reconciliation_report_date" model="account.report.column">
<field name="name">Date</field>
<field name="expression_label">date</field>
<field name="figure_type">date</field>
</record>
<record id="bank_reconciliation_report_label" model="account.report.column">
<field name="name">Label</field>
<field name="expression_label">label</field>
<field name="figure_type">string</field>
</record>
<record id="bank_reconciliation_report_amount_currency" model="account.report.column">
<field name="name">Amount Currency</field>
<field name="expression_label">amount_currency</field>
<field name="figure_type">monetary</field>
</record>
<record id="bank_reconciliation_report_currency" model="account.report.column">
<field name="name">Currency</field>
<field name="expression_label">currency</field>
<field name="figure_type">string</field>
</record>
<record id="bank_reconciliation_report_amount" model="account.report.column">
<field name="name">Amount</field>
<field name="expression_label">amount</field>
<field name="figure_type">monetary</field>
</record>
</field>
<field name="line_ids">
<record id="balance_bank" model="account.report.line">
<field name="name">Balance of Bank</field>
<field name="code">balance_bank</field>
<field name="hierarchy_level">0</field>
<field name="expression_ids">
<record id="balance_bank_expr" model="account.report.expression">
<field name="label">amount</field>
<field name="engine">aggregation</field>
<field name="formula">last_statement_balance.amount + transaction_without_statement.amount + misc_operations.amount</field>
<field name="auditable" eval="True"/>
</record>
<record id="balance_bank_expr_forced_currency_amount" model="account.report.expression">
<field name="label">_currency_amount</field>
<field name="engine">custom</field>
<field name="formula">_report_custom_engine_forced_currency_amount</field>
<field name="subformula">amount_currency_id</field>
</record>
</field>
<field name="children_ids">
<record id="last_statement_balance" model="account.report.line">
<field name="name">Last statement balance</field>
<field name="code">last_statement_balance</field>
<field name="expression_ids">
<record id="last_statement_balance_amount" model="account.report.expression">
<field name="label">amount</field>
<field name="engine">custom</field>
<field name="formula">_report_custom_engine_last_statement_balance_amount</field>
<field name="subformula">amount</field>
<field name="auditable" eval="False"/>
</record>
<record id="last_statement_balance_forced_currency_amount" model="account.report.expression">
<field name="label">_currency_amount</field>
<field name="engine">custom</field>
<field name="formula">_report_custom_engine_last_statement_balance_amount</field>
<field name="subformula">amount_currency_id</field>
</record>
</field>
<field name="children_ids">
<record id="unreconciled_last_statement_receipts" model="account.report.line">
<field name="name">Including Unreconciled Receipts</field>
<field name="code">last_statement_receipts</field>
<field name="groupby">id</field>
<field name="foldable" eval="True"/>
<field name="expression_ids">
<record id="unreconciled_last_statement_receipts_date" model="account.report.expression">
<field name="label">date</field>
<field name="engine">custom</field>
<field name="formula">_report_custom_engine_unreconciled_last_statement_receipts</field>
<field name="subformula">date</field>
<field name="auditable" eval="False"/>
</record>
<record id="unreconciled_last_statement_receipts_label" model="account.report.expression">
<field name="label">label</field>
<field name="engine">custom</field>
<field name="formula">_report_custom_engine_unreconciled_last_statement_receipts</field>
<field name="subformula">label</field>
<field name="auditable" eval="False"/>
</record>
<record id="unreconciled_last_statement_receipts_amount_currency" model="account.report.expression">
<field name="label">amount_currency</field>
<field name="engine">custom</field>
<field name="formula">_report_custom_engine_unreconciled_last_statement_receipts</field>
<field name="subformula">amount_currency</field>
<field name="auditable" eval="False"/>
</record>
<record id="unreconciled_last_statement_receipts_forced_currency_amount_currency" model="account.report.expression">
<field name="label">_currency_amount_currency</field>
<field name="engine">custom</field>
<field name="formula">_report_custom_engine_unreconciled_last_statement_receipts</field>
<field name="subformula">amount_currency_currency_id</field>
</record>
<record id="unreconciled_last_statement_receipts_currency" model="account.report.expression">
<field name="label">currency</field>
<field name="engine">custom</field>
<field name="formula">_report_custom_engine_unreconciled_last_statement_receipts</field>
<field name="subformula">currency</field>
<field name="auditable" eval="False"/>
</record>
<record id="unreconciled_last_statement_receipts_amount" model="account.report.expression">
<field name="label">amount</field>
<field name="engine">custom</field>
<field name="formula">_report_custom_engine_unreconciled_last_statement_receipts</field>
<field name="subformula">amount</field>
<field name="auditable" eval="False"/>
</record>
<record id="unreconciled_last_statement_receipts_forced_currency_amount" model="account.report.expression">
<field name="label">_currency_amount</field>
<field name="engine">custom</field>
<field name="formula">_report_custom_engine_unreconciled_last_statement_receipts</field>
<field name="subformula">amount_currency_id</field>
</record>
</field>
</record>
<record id="unreconciled_last_statement_payments" model="account.report.line">
<field name="name">Including Unreconciled Payments</field>
<field name="code">last_statement_payments</field>
<field name="groupby">id</field>
<field name="foldable" eval="True"/>
<field name="expression_ids">
<record id="unreconciled_last_statement_payments_date" model="account.report.expression">
<field name="label">date</field>
<field name="engine">custom</field>
<field name="formula">_report_custom_engine_unreconciled_last_statement_payments</field>
<field name="subformula">date</field>
<field name="auditable" eval="False"/>
</record>
<record id="unreconciled_last_statement_payments_label" model="account.report.expression">
<field name="label">label</field>
<field name="engine">custom</field>
<field name="formula">_report_custom_engine_unreconciled_last_statement_payments</field>
<field name="subformula">label</field>
<field name="auditable" eval="False"/>
</record>
<record id="unreconciled_last_statement_payments_amount_currency" model="account.report.expression">
<field name="label">amount_currency</field>
<field name="engine">custom</field>
<field name="formula">_report_custom_engine_unreconciled_last_statement_payments</field>
<field name="subformula">amount_currency</field>
<field name="auditable" eval="False"/>
</record>
<record id="unreconciled_last_statement_payments_forced_currency_amount_currency" model="account.report.expression">
<field name="label">_currency_amount_currency</field>
<field name="engine">custom</field>
<field name="formula">_report_custom_engine_unreconciled_last_statement_payments</field>
<field name="subformula">amount_currency_currency_id</field>
</record>
<record id="unreconciled_last_statement_payments_currency" model="account.report.expression">
<field name="label">currency</field>
<field name="engine">custom</field>
<field name="formula">_report_custom_engine_unreconciled_last_statement_payments</field>
<field name="subformula">currency</field>
<field name="auditable" eval="False"/>
</record>
<record id="unreconciled_last_statement_payments_amount" model="account.report.expression">
<field name="label">amount</field>
<field name="engine">custom</field>
<field name="formula">_report_custom_engine_unreconciled_last_statement_payments</field>
<field name="subformula">amount</field>
<field name="auditable" eval="False"/>
</record>
<record id="unreconciled_last_statement_payments_forced_currency_amount" model="account.report.expression">
<field name="label">_currency_amount</field>
<field name="engine">custom</field>
<field name="formula">_report_custom_engine_unreconciled_last_statement_payments</field>
<field name="subformula">amount_currency_id</field>
</record>
</field>
</record>
</field>
</record>
<record id="transaction_without_statement" model="account.report.line">
<field name="name">Transactions without statement</field>
<field name="code">transaction_without_statement</field>
<field name="expression_ids">
<record id="transaction_without_statement_expr" model="account.report.expression">
<field name="label">amount</field>
<field name="engine">custom</field>
<field name="formula">_report_custom_engine_transaction_without_statement_amount</field>
<field name="subformula">amount</field>
<field name="auditable" eval="False"/>
</record>
<record id="transaction_without_statement_expr_forced_currency_amount" model="account.report.expression">
<field name="label">_currency_amount</field>
<field name="engine">custom</field>
<field name="formula">_report_custom_engine_transaction_without_statement_amount</field>
<field name="subformula">amount_currency_id</field>
</record>
</field>
<field name="children_ids">
<record id="no_statement_unreconciled_receipt" model="account.report.line">
<field name="name">Including Unreconciled Receipts</field>
<field name="code">unreconciled_receipt</field>
<field name="groupby">id</field>
<field name="foldable" eval="True"/>
<field name="expression_ids">
<record id="no_statement_unreconciled_receipt_date" model="account.report.expression">
<field name="label">date</field>
<field name="engine">custom</field>
<field name="formula">_report_custom_engine_unreconciled_receipts</field>
<field name="subformula">date</field>
<field name="auditable" eval="False"/>
</record>
<record id="no_statement_unreconciled_receipt_label" model="account.report.expression">
<field name="label">label</field>
<field name="engine">custom</field>
<field name="formula">_report_custom_engine_unreconciled_receipts</field>
<field name="subformula">label</field>
<field name="auditable" eval="False"/>
</record>
<record id="no_statement_unreconciled_receipt_amount_currency" model="account.report.expression">
<field name="label">amount_currency</field>
<field name="engine">custom</field>
<field name="formula">_report_custom_engine_unreconciled_receipts</field>
<field name="subformula">amount_currency</field>
<field name="auditable" eval="False"/>
</record>
<record id="no_statement_unreconciled_receipt_forced_currency_amount_currency" model="account.report.expression">
<field name="label">_currency_amount_currency</field>
<field name="engine">custom</field>
<field name="formula">_report_custom_engine_unreconciled_receipts</field>
<field name="subformula">amount_currency_currency_id</field>
</record>
<record id="no_statement_unreconciled_receipt_currency" model="account.report.expression">
<field name="label">currency</field>
<field name="engine">custom</field>
<field name="formula">_report_custom_engine_unreconciled_receipts</field>
<field name="subformula">currency</field>
<field name="auditable" eval="False"/>
</record>
<record id="no_statement_unreconciled_receipt_amount" model="account.report.expression">
<field name="label">amount</field>
<field name="engine">custom</field>
<field name="formula">_report_custom_engine_unreconciled_receipts</field>
<field name="subformula">amount</field>
<field name="auditable" eval="False"/>
</record>
<record id="no_statement_unreconciled_receipt_forced_currency_amount" model="account.report.expression">
<field name="label">_currency_amount</field>
<field name="engine">custom</field>
<field name="formula">_report_custom_engine_unreconciled_receipts</field>
<field name="subformula">amount_currency_id</field>
</record>
</field>
</record>
<record id="no_statement_unreconciled_payments" model="account.report.line">
<field name="name">Including Unreconciled Payments</field>
<field name="code">unreconciled_payments</field>
<field name="groupby">id</field>
<field name="foldable" eval="True"/>
<field name="expression_ids">
<record id="no_statement_unreconciled_payments_date" model="account.report.expression">
<field name="label">date</field>
<field name="engine">custom</field>
<field name="formula">_report_custom_engine_unreconciled_payments</field>
<field name="subformula">date</field>
<field name="auditable" eval="False"/>
</record>
<record id="no_statement_unreconciled_payments_label" model="account.report.expression">
<field name="label">label</field>
<field name="engine">custom</field>
<field name="formula">_report_custom_engine_unreconciled_payments</field>
<field name="subformula">label</field>
<field name="auditable" eval="False"/>
</record>
<record id="no_statement_unreconciled_payments_amount_currency" model="account.report.expression">
<field name="label">amount_currency</field>
<field name="engine">custom</field>
<field name="formula">_report_custom_engine_unreconciled_payments</field>
<field name="subformula">amount_currency</field>
<field name="auditable" eval="False"/>
</record>
<record id="no_statement_unreconciled_payments_forced_currency_amount_currency" model="account.report.expression">
<field name="label">_currency_amount_currency</field>
<field name="engine">custom</field>
<field name="formula">_report_custom_engine_unreconciled_payments</field>
<field name="subformula">amount_currency_currency_id</field>
</record>
<record id="no_statement_unreconciled_payments_currency" model="account.report.expression">
<field name="label">currency</field>
<field name="engine">custom</field>
<field name="formula">_report_custom_engine_unreconciled_payments</field>
<field name="subformula">currency</field>
<field name="auditable" eval="False"/>
</record>
<record id="no_statement_unreconciled_payments_amount" model="account.report.expression">
<field name="label">amount</field>
<field name="engine">custom</field>
<field name="formula">_report_custom_engine_unreconciled_payments</field>
<field name="subformula">amount</field>
<field name="auditable" eval="False"/>
</record>
<record id="no_statement_unreconciled_payments_forced_currency_amount" model="account.report.expression">
<field name="label">_currency_amount</field>
<field name="engine">custom</field>
<field name="formula">_report_custom_engine_unreconciled_payments</field>
<field name="subformula">amount_currency_id</field>
</record>
</field>
</record>
</field>
</record>
<record id="misc_operations" model="account.report.line">
<field name="name">Misc. operations</field>
<field name="code">misc_operations</field>
<field name="expression_ids">
<record id="misc_operations_amount" model="account.report.expression">
<field name="label">amount</field>
<field name="engine">custom</field>
<field name="formula">_report_custom_engine_misc_operations</field>
<field name="subformula">amount</field>
<field name="auditable" eval="True"/>
</record>
<record id="misc_operations_amount_forced_currency_amount" model="account.report.expression">
<field name="label">_currency_amount</field>
<field name="engine">custom</field>
<field name="formula">_report_custom_engine_misc_operations</field>
<field name="subformula">amount_currency_id</field>
</record>
</field>
</record>
</field>
</record>
<record id="outstanding" model="account.report.line">
<field name="name">Outstanding Receipts/Payments</field>
<field name="hierarchy_level">0</field>
<field name="expression_ids">
<record id="outstanding_expr" model="account.report.expression">
<field name="label">amount</field>
<field name="engine">aggregation</field>
<field name="formula">outstanding_receipts.amount + outstanding_payments.amount</field>
<field name="auditable" eval="False"/>
</record>
<record id="outstanding_expr_forced_currency_amount" model="account.report.expression">
<field name="label">_currency_amount</field>
<field name="engine">custom</field>
<field name="formula">_report_custom_engine_forced_currency_amount</field>
<field name="subformula">amount_currency_id</field>
</record>
</field>
<field name="children_ids">
<record id="outstanding_receipts" model="account.report.line">
|
||||||
|
<field name="name">(+) Outstanding Receipts</field>
|
||||||
|
<field name="code">outstanding_receipts</field>
|
||||||
|
<field name="groupby">id</field>
|
||||||
|
<field name="foldable" eval="True"/>
|
||||||
|
<field name="expression_ids">
|
||||||
|
<record id="outstanding_receipts_date" model="account.report.expression">
|
||||||
|
<field name="label">date</field>
|
||||||
|
<field name="engine">custom</field>
|
||||||
|
<field name="formula">_report_custom_engine_outstanding_receipts</field>
|
||||||
|
<field name="subformula">date</field>
|
||||||
|
<field name="auditable" eval="False"/>
|
||||||
|
</record>
|
||||||
|
<record id="outstanding_receipts_label" model="account.report.expression">
|
||||||
|
<field name="label">label</field>
|
||||||
|
<field name="engine">custom</field>
|
||||||
|
<field name="formula">_report_custom_engine_outstanding_receipts</field>
|
||||||
|
<field name="subformula">label</field>
|
||||||
|
<field name="auditable" eval="False"/>
|
||||||
|
</record>
|
||||||
|
<record id="outstanding_receipts_amount_currency" model="account.report.expression">
|
||||||
|
<field name="label">amount_currency</field>
|
||||||
|
<field name="engine">custom</field>
|
||||||
|
<field name="formula">_report_custom_engine_outstanding_receipts</field>
|
||||||
|
<field name="subformula">amount_currency</field>
|
||||||
|
<field name="auditable" eval="False"/>
|
||||||
|
</record>
|
||||||
|
<record id="outstanding_receipts_forced_currency_amount_currency" model="account.report.expression">
|
||||||
|
<field name="label">_currency_amount_currency</field>
|
||||||
|
<field name="engine">custom</field>
|
||||||
|
<field name="formula">_report_custom_engine_outstanding_receipts</field>
|
||||||
|
<field name="subformula">amount_currency_currency_id</field>
|
||||||
|
</record>
|
||||||
|
<record id="outstanding_receipts_currency" model="account.report.expression">
|
||||||
|
<field name="label">currency</field>
|
||||||
|
<field name="engine">custom</field>
|
||||||
|
<field name="formula">_report_custom_engine_outstanding_receipts</field>
|
||||||
|
<field name="subformula">currency</field>
|
||||||
|
<field name="auditable" eval="False"/>
|
||||||
|
</record>
|
||||||
|
<record id="outstanding_receipts_amount" model="account.report.expression">
|
||||||
|
<field name="label">amount</field>
|
||||||
|
<field name="engine">custom</field>
|
||||||
|
<field name="formula">_report_custom_engine_outstanding_receipts</field>
|
||||||
|
<field name="subformula">amount</field>
|
||||||
|
<field name="auditable" eval="False"/>
|
||||||
|
</record>
|
||||||
|
<record id="outstanding_receipts_forced_currency_amount" model="account.report.expression">
|
||||||
|
<field name="label">_currency_amount</field>
|
||||||
|
<field name="engine">custom</field>
|
||||||
|
<field name="formula">_report_custom_engine_outstanding_receipts</field>
|
||||||
|
<field name="subformula">amount_currency_id</field>
|
||||||
|
</record>
|
||||||
|
</field>
|
||||||
|
</record>
|
||||||
|
<record id="outstanding_payments" model="account.report.line">
|
||||||
|
<field name="name">(-) Outstanding Payments</field>
|
||||||
|
<field name="code">outstanding_payments</field>
|
||||||
|
<field name="groupby">id</field>
|
||||||
|
<field name="foldable" eval="True"/>
|
||||||
|
<field name="expression_ids">
|
||||||
|
<record id="outstanding_payments_date" model="account.report.expression">
|
||||||
|
<field name="label">date</field>
|
||||||
|
<field name="engine">custom</field>
|
||||||
|
<field name="formula">_report_custom_engine_outstanding_payments</field>
|
||||||
|
<field name="subformula">date</field>
|
||||||
|
<field name="auditable" eval="False"/>
|
||||||
|
</record>
|
||||||
|
<record id="outstanding_payments_label" model="account.report.expression">
|
||||||
|
<field name="label">label</field>
|
||||||
|
<field name="engine">custom</field>
|
||||||
|
<field name="formula">_report_custom_engine_outstanding_payments</field>
|
||||||
|
<field name="subformula">label</field>
|
||||||
|
<field name="auditable" eval="False"/>
|
||||||
|
</record>
|
||||||
|
<record id="outstanding_payments_amount_currency" model="account.report.expression">
|
||||||
|
<field name="label">amount_currency</field>
|
||||||
|
<field name="engine">custom</field>
|
||||||
|
<field name="formula">_report_custom_engine_outstanding_payments</field>
|
||||||
|
<field name="subformula">amount_currency</field>
|
||||||
|
<field name="auditable" eval="False"/>
|
||||||
|
</record>
|
||||||
|
<record id="outstanding_payments_forced_currency_amount_currency" model="account.report.expression">
|
||||||
|
<field name="label">_currency_amount_currency</field>
|
||||||
|
<field name="engine">custom</field>
|
||||||
|
<field name="formula">_report_custom_engine_outstanding_payments</field>
|
||||||
|
<field name="subformula">amount_currency_currency_id</field>
|
||||||
|
</record>
|
||||||
|
<record id="outstanding_payments_currency" model="account.report.expression">
|
||||||
|
<field name="label">currency</field>
|
||||||
|
<field name="engine">custom</field>
|
||||||
|
<field name="formula">_report_custom_engine_outstanding_payments</field>
|
||||||
|
<field name="subformula">currency</field>
|
||||||
|
<field name="auditable" eval="False"/>
|
||||||
|
</record>
|
||||||
|
<record id="outstanding_payments_amount" model="account.report.expression">
|
||||||
|
<field name="label">amount</field>
|
||||||
|
<field name="engine">custom</field>
|
||||||
|
<field name="formula">_report_custom_engine_outstanding_payments</field>
|
||||||
|
<field name="subformula">amount</field>
|
||||||
|
<field name="auditable" eval="False"/>
|
||||||
|
</record>
|
||||||
|
<record id="outstanding_payments_forced_currency_amount" model="account.report.expression">
|
||||||
|
<field name="label">_currency_amount</field>
|
||||||
|
<field name="engine">custom</field>
|
||||||
|
<field name="formula">_report_custom_engine_outstanding_payments</field>
|
||||||
|
<field name="subformula">amount_currency_id</field>
|
||||||
|
</record>
|
||||||
|
</field>
|
||||||
|
</record>
|
||||||
|
</field>
|
||||||
|
</record>
|
||||||
|
</field>
|
||||||
|
</record>
|
||||||
|
</odoo>
|
||||||
19
Fusion Accounting/data/cash_flow_report.xml
Normal file
@@ -0,0 +1,19 @@
<?xml version="1.0" encoding="utf-8"?>
<odoo>
<record id="cash_flow_report" model="account.report">
<field name="name">Cash Flow Statement</field>
<field name="filter_unfold_all" eval="True"/>
<field name="filter_date_range" eval="True"/>
<field name="filter_journals" eval="True"/>
<field name="filter_period_comparison" eval="False"/>
<field name="filter_multi_company">selector</field>
<field name="currency_translation">current</field>
<field name="custom_handler_model_id" ref="model_account_cash_flow_report_handler"/>
<field name="column_ids">
<record id="cash_flow_report_balance" model="account.report.column">
<field name="name">Balance</field>
<field name="expression_label">balance</field>
</record>
</field>
</record>
</odoo>
43
Fusion Accounting/data/deferred_reports.xml
Normal file
@@ -0,0 +1,43 @@
<?xml version="1.0" encoding="utf-8"?>
<odoo>

<record id="deferred_expense_report" model="account.report">
<field name="name">Deferred Expense Report</field>
<field name="filter_journals" eval="True"/>
<field name="filter_analytic" eval="True"/>
<field name="filter_period_comparison" eval="True"/>
<field name="filter_growth_comparison" eval="False"/>
<field name="filter_multi_company">selector</field>
<field name="filter_unfold_all" eval="True"/>
<field name="filter_hierarchy">by_default</field>
<field name="default_opening_date_filter">previous_month</field>
<field name="search_bar" eval="True"/>
<field name="custom_handler_model_id" ref="model_account_deferred_expense_report_handler"/>
<field name="column_ids">
<record id="deferred_expense_current" model="account.report.column">
<field name="name">Current</field>
<field name="expression_label">current</field>
</record>
</field>
</record>

<record id="deferred_revenue_report" model="account.report">
<field name="name">Deferred Revenue Report</field>
<field name="filter_journals" eval="True"/>
<field name="filter_analytic" eval="True"/>
<field name="filter_period_comparison" eval="True"/>
<field name="filter_growth_comparison" eval="False"/>
<field name="filter_multi_company">selector</field>
<field name="filter_unfold_all" eval="True"/>
<field name="filter_hierarchy">by_default</field>
<field name="default_opening_date_filter">previous_month</field>
<field name="search_bar" eval="True"/>
<field name="custom_handler_model_id" ref="model_account_deferred_revenue_report_handler"/>
<field name="column_ids">
<record id="deferred_revenue_current" model="account.report.column">
<field name="name">Current</field>
<field name="expression_label">current</field>
</record>
</field>
</record>
</odoo>
37
Fusion Accounting/data/digest_data.xml
Normal file
@@ -0,0 +1,37 @@
<?xml version='1.0' encoding='utf-8'?>
<odoo>
<data noupdate="1">
<record id="digest.digest_digest_default" model="digest.digest">
<field name="kpi_account_bank_cash">True</field>
</record>
</data>
<data noupdate="0">
<record id="digest_tip_fusion_accounting_0" model="digest.tip">
<field name="name">Tip: Bulk update journal items</field>
<field name="sequence">900</field>
<field name="group_id" ref="account.group_account_user" />
<field name="tip_description" type="html">
<div>
<b class="tip_title">Tip: Bulk update journal items</b>
<p class="tip_content">From any list view, select multiple records and the list becomes editable. If you update a cell, selected records are updated all at once. Use this feature to update multiple journal entries from the General Ledger, or any Journal view.</p>
<img src="https://download.odoocdn.com/digests/fusion_accounting/static/src/img/milk-accounting-bulk.gif" width="540" class="illustration_border" />
</div>
</field>
</record>
<record id="digest_tip_fusion_accounting_1" model="digest.tip">
<field name="name">Tip: Find an Accountant or register your Accounting Firm</field>
<field name="sequence">1000</field>
<field name="group_id" ref="account.group_account_user" />
<field name="tip_description" type="html">
<div>
<b class="tip_title">Tip: Find an Accountant or register your Accounting Firm</b>
<p class="tip_content">Click here to find an accountant or if you want to list out your accounting services on Odoo</p>
<p class="mt-3">
<a class="tip_button" href="https://odoo.com/accounting-firms" target="_blank"><span class="tip_button_text">Find an Accountant</span></a>
<a class="tip_button" href="https://odoo.com/accounting-firms/register" target="_blank"><span class="tip_button_text">Register your Accounting Firm</span></a>
</p>
</div>
</field>
</record>
</data>
</odoo>
395
Fusion Accounting/data/executive_summary.xml
Normal file
@@ -0,0 +1,395 @@
<?xml version="1.0" encoding="utf-8"?>
<odoo>
<record id="executive_summary" model="account.report">
<field name="name">Executive Summary</field>
<field name="filter_multi_company">selector</field>
<field name="default_opening_date_filter">this_year</field>
<field name="column_ids">
<record id="executive_summary_column" model="account.report.column">
<field name="name">Balance</field>
<field name="expression_label">balance</field>
</record>
</field>
<field name="line_ids">
<record id="account_financial_report_executivesummary_cash0" model="account.report.line">
<field name="name">Cash</field>
<field name="hierarchy_level">0</field>
<field name="children_ids">
<record id="account_financial_report_executivesummary_cash_received0" model="account.report.line">
<field name="name">Cash received</field>
<field name="code">CR</field>
<field name="expression_ids">
<record id="account_financial_report_executivesummary_cash_received0_balance" model="account.report.expression">
<field name="label">balance</field>
<field name="engine">domain</field>
<field name="formula" eval="[('account_id.account_type', 'in', ('asset_cash', 'liability_credit_card')), ('debit', '>', 0.0)]"/>
<field name="subformula">sum</field>
<field name="auditable" eval="False"/>
</record>
</field>
</record>
<record id="account_financial_report_executivesummary_cash_spent0" model="account.report.line">
<field name="name">Cash spent</field>
<field name="code">CS</field>
<field name="expression_ids">
<record id="account_financial_report_executivesummary_cash_spent0_balance" model="account.report.expression">
<field name="label">balance</field>
<field name="engine">domain</field>
<field name="formula" eval="[('account_id.account_type', 'in', ('asset_cash', 'liability_credit_card')), ('credit', '>', 0.0)]"/>
<field name="subformula">sum</field>
<field name="green_on_positive" eval="False"/>
<field name="auditable" eval="False"/>
</record>
</field>
</record>
<record id="account_financial_report_executivesummary_cash_surplus0" model="account.report.line">
<field name="name">Cash surplus</field>
<field name="expression_ids">
<record id="account_financial_report_executivesummary_cash_surplus0_balance" model="account.report.expression">
<field name="label">balance</field>
<field name="engine">aggregation</field>
<field name="formula">CR.balance + CS.balance</field>
<field name="auditable" eval="False"/>
</record>
</field>
</record>
<record id="account_financial_report_executivesummary_closing_bank_balance0" model="account.report.line">
<field name="name">Closing bank balance</field>
<field name="expression_ids">
<record id="account_financial_report_executivesummary_closing_bank_balance0_balance" model="account.report.expression">
<field name="label">balance</field>
<field name="engine">domain</field>
<field name="formula" eval="[('account_id.account_type', 'in', ('asset_cash', 'liability_credit_card'))]"/>
<field name="date_scope">from_beginning</field>
<field name="subformula">sum</field>
<field name="auditable" eval="False"/>
</record>
</field>
</record>
</field>
</record>
<record id="account_financial_report_executivesummary_profitability0" model="account.report.line">
<field name="name">Profitability</field>
<field name="hierarchy_level">0</field>
<field name="children_ids">
<record id="account_financial_report_executivesummary_income0" model="account.report.line">
<field name="name">Revenue</field>
<field name="expression_ids">
<record id="account_financial_report_executivesummary_income0_balance" model="account.report.expression">
<field name="label">balance</field>
<field name="engine">aggregation</field>
<field name="formula">REV.balance</field>
<field name="date_scope">strict_range</field>
<field name="subformula">cross_report</field>
<field name="auditable" eval="False"/>
</record>
</field>
</record>
<record id="account_financial_report_executivesummary_direct_costs0" model="account.report.line">
<field name="name">Cost of Revenue</field>
<field name="code">EXEC_COS</field>
<field name="expression_ids">
<record id="account_financial_report_executivesummary_direct_costs0_balance" model="account.report.expression">
<field name="label">balance</field>
<field name="engine">aggregation</field>
<field name="formula">COS.balance</field>
<field name="date_scope">strict_range</field>
<field name="subformula">cross_report</field>
<field name="green_on_positive" eval="False"/>
<field name="auditable" eval="False"/>
</record>
</field>
</record>
<record id="account_financial_report_executivesummary_gross_profit0" model="account.report.line">
<field name="name">Gross profit</field>
<field name="expression_ids">
<record id="account_financial_report_executivesummary_gross_profit0_balance" model="account.report.expression">
<field name="label">balance</field>
<field name="engine">aggregation</field>
<field name="formula">GRP.balance</field>
<field name="date_scope">strict_range</field>
<field name="subformula">cross_report</field>
<field name="auditable" eval="False"/>
</record>
</field>
</record>
<record id="account_financial_report_executivesummary_expenses0" model="account.report.line">
<field name="name">Expenses</field>
<field name="expression_ids">
<record id="account_financial_report_executivesummary_expenses0_balance" model="account.report.expression">
<field name="label">balance</field>
<field name="engine">aggregation</field>
<field name="formula">EXP.balance</field>
<field name="date_scope">strict_range</field>
<field name="subformula">cross_report</field>
<field name="green_on_positive" eval="False"/>
<field name="auditable" eval="False"/>
</record>
</field>
</record>
<record id="account_financial_report_executivesummary_profit0" model="account.report.line">
<field name="name">Net Profit</field>
<field name="code">EXEC_NEP</field>
<field name="expression_ids">
<record id="account_financial_report_executivesummary_profit0_balance" model="account.report.expression">
<field name="label">balance</field>
<field name="engine">aggregation</field>
<field name="formula">NEP.balance</field>
<field name="date_scope">strict_range</field>
<field name="subformula">cross_report</field>
<field name="auditable" eval="False"/>
</record>
</field>
</record>
</field>
</record>
<record id="account_financial_report_executivesummary_balancesheet0" model="account.report.line">
<field name="name">Balance Sheet</field>
<field name="hierarchy_level">0</field>
<field name="children_ids">
<record id="account_financial_report_executivesummary_debtors0" model="account.report.line">
<field name="name">Receivables</field>
<field name="code">DEB</field>
<field name="expression_ids">
<record id="account_financial_report_executivesummary_debtors0_balance" model="account.report.expression">
<field name="label">balance</field>
<field name="engine">domain</field>
<field name="formula" eval="[('account_id.account_type', '=', 'asset_receivable')]"/>
<field name="date_scope">from_beginning</field>
<field name="subformula">sum</field>
<field name="auditable" eval="False"/>
</record>
</field>
</record>
<record id="account_financial_report_executivesummary_creditors0" model="account.report.line">
<field name="name">Payables</field>
<field name="code">CRE</field>
<field name="expression_ids">
<record id="account_financial_report_executivesummary_creditors0_balance" model="account.report.expression">
<field name="label">balance</field>
<field name="engine">domain</field>
<field name="formula" eval="[('account_id.account_type', '=', 'liability_payable')]"/>
<field name="date_scope">from_beginning</field>
<field name="subformula">sum</field>
<field name="green_on_positive" eval="False"/>
<field name="auditable" eval="False"/>
</record>
</field>
</record>
<record id="account_financial_report_executivesummary_net_assets0" model="account.report.line">
<field name="name">Net assets</field>
<field name="code">EXEC_SUMMARY_NA</field>
<field name="expression_ids">
<record id="account_financial_report_executivesummary_net_assets0_balance" model="account.report.expression">
<field name="label">balance</field>
<field name="engine">aggregation</field>
<field name="formula">TA.balance - L.balance</field>
<field name="date_scope">from_beginning</field>
<field name="subformula">cross_report</field>
<field name="auditable" eval="False"/>
</record>
</field>
</record>
</field>
</record>
<record id="account_financial_report_executivesummary_performance0" model="account.report.line">
<field name="name">Performance</field>
<field name="hierarchy_level">0</field>
<field name="children_ids">
<record id="account_financial_report_executivesummary_gpmargin0" model="account.report.line">
<field name="name">Gross profit margin (gross profit / operating income)</field>
<field name="code">GPMARGIN0</field>
<field name="expression_ids">
<record id="account_financial_report_executivesummary_gpmargin0_balance" model="account.report.expression">
<field name="label">balance</field>
<field name="engine">aggregation</field>
<field name="formula">GPMARGIN0.grp / GPMARGIN0.opinc * 100</field>
<field name="subformula">ignore_zero_division</field>
<field name="figure_type">percentage</field>
<field name="auditable" eval="False"/>
</record>
<record id="account_financial_report_executivesummary_gpmargin0_grp" model="account.report.expression">
<field name="label">grp</field>
<field name="engine">aggregation</field>
<field name="formula">GRP.balance</field>
<field name="date_scope">strict_range</field>
<field name="subformula">cross_report</field>
</record>
<record id="account_financial_report_executivesummary_gpmargin0_opinc" model="account.report.expression">
<field name="label">opinc</field>
<field name="engine">aggregation</field>
<field name="formula">REV.balance</field>
<field name="date_scope">strict_range</field>
<field name="subformula">cross_report</field>
</record>
</field>
</record>
<record id="account_financial_report_executivesummary_npmargin0" model="account.report.line">
<field name="name">Net profit margin (net profit / income)</field>
<field name="code">NPMARGIN0</field>
<field name="expression_ids">
<record id="account_financial_report_executivesummary_npmargin0_balance" model="account.report.expression">
<field name="label">balance</field>
<field name="engine">aggregation</field>
<field name="formula">NPMARGIN0.nep / NPMARGIN0.inc * 100</field>
<field name="subformula">ignore_zero_division</field>
<field name="figure_type">percentage</field>
<field name="auditable" eval="False"/>
</record>
<record id="account_financial_report_executivesummary_npmargin0_nep" model="account.report.expression">
<field name="label">nep</field>
<field name="engine">aggregation</field>
<field name="formula">NEP.balance</field>
<field name="date_scope">strict_range</field>
<field name="subformula">cross_report</field>
</record>
<record id="account_financial_report_executivesummary_npmargin0_inc" model="account.report.expression">
<field name="label">inc</field>
<field name="engine">aggregation</field>
<field name="formula">INC.balance</field>
<field name="date_scope">strict_range</field>
<field name="subformula">cross_report</field>
</record>
</field>
</record>
<record id="account_financial_report_executivesummary_return_investment0" model="account.report.line">
<field name="name">Return on investments (net profit / assets)</field>
<field name="code">ROI</field>
<field name="expression_ids">
<record id="account_financial_report_executivesummary_return_investment0_balance" model="account.report.expression">
<field name="label">balance</field>
<field name="engine">aggregation</field>
<field name="formula">ROI.nep / ROI.ta * 100</field>
<field name="subformula">ignore_zero_division</field>
<field name="figure_type">percentage</field>
<field name="auditable" eval="False"/>
</record>
<record id="account_financial_report_executivesummary_return_investment0_nep" model="account.report.expression">
<field name="label">nep</field>
<field name="engine">aggregation</field>
<field name="formula">NEP.balance</field>
<field name="date_scope">strict_range</field>
<field name="subformula">cross_report</field>
</record>
<record id="account_financial_report_executivesummary_return_investment0_ta" model="account.report.expression">
<field name="label">ta</field>
<field name="engine">aggregation</field>
<field name="formula">TA.balance</field>
<field name="date_scope">from_beginning</field>
<field name="subformula">cross_report</field>
</record>
</field>
</record>
</field>
</record>
<record id="account_financial_report_executivesummary_position0" model="account.report.line">
<field name="name">Position</field>
<field name="hierarchy_level">0</field>
<field name="children_ids">
<record id="account_financial_report_executivesummary_avdebt0" model="account.report.line">
<field name="name">Average debtors days</field>
<field name="code">AVG_DEBT_DAYS</field>
<field name="expression_ids">
<record id="account_financial_report_executivesummary_avdebt0_balance" model="account.report.expression">
<field name="label">balance</field>
<field name="engine">aggregation</field>
<field name="formula">DEB.balance / AVG_DEBT_DAYS.opinc * AVG_DEBT_DAYS.NDays</field>
<field name="subformula">ignore_zero_division</field>
<field name="green_on_positive" eval="False"/>
<field name="figure_type">float</field>
<field name="auditable" eval="False"/>
</record>
<record id="account_financial_report_executivesummary_avdebt0_opinc" model="account.report.expression">
<field name="label">opinc</field>
<field name="engine">aggregation</field>
<field name="formula">REV.balance</field>
<field name="date_scope">strict_range</field>
<field name="subformula">cross_report</field>
</record>
<record id="account_financial_report_executivesummary_avdebt0_ndays" model="account.report.expression">
<field name="label">NDays</field>
<field name="engine">custom</field>
<field name="formula">_report_custom_engine_executive_summary_ndays</field>
<field name="auditable" eval="False"/>
</record>
</field>
</record>
<record id="account_financial_report_executivesummary_avgcre0" model="account.report.line">
<field name="name">Average creditors days</field>
<field name="code">AVG_CRED_DAYS</field>
<field name="expression_ids">
<record id="account_financial_report_executivesummary_avgcre0_balance" model="account.report.expression">
<field name="label">balance</field>
<field name="engine">aggregation</field>
<field name="formula">-CRE.balance / (AVG_CRED_DAYS.cos + AVG_CRED_DAYS.exp) * AVG_CRED_DAYS.NDays</field>
<field name="subformula">ignore_zero_division</field>
<field name="green_on_positive" eval="False"/>
<field name="figure_type">float</field>
<field name="auditable" eval="False"/>
</record>
<record id="account_financial_report_executivesummary_avgcre0_cos" model="account.report.expression">
<field name="label">cos</field>
<field name="engine">aggregation</field>
<field name="formula">COS.balance</field>
<field name="date_scope">strict_range</field>
<field name="subformula">cross_report</field>
</record>
<record id="account_financial_report_executivesummary_avgcre0_exp" model="account.report.expression">
<field name="label">exp</field>
<field name="engine">aggregation</field>
<field name="formula">EXP.balance</field>
<field name="date_scope">strict_range</field>
<field name="subformula">cross_report</field>
</record>
<record id="account_financial_report_executivesummary_avgcre0_ndays" model="account.report.expression">
<field name="label">NDays</field>
<field name="engine">custom</field>
<field name="formula">_report_custom_engine_executive_summary_ndays</field>
<field name="auditable" eval="False"/>
</record>
</field>
</record>
<record id="account_financial_report_executivesummary_st_cash_forecast0" model="account.report.line">
<field name="name">Short term cash forecast</field>
<field name="expression_ids">
<record id="account_financial_report_executivesummary_st_cash_forecast0_balance" model="account.report.expression">
|
||||||
|
<field name="label">balance</field>
|
||||||
|
<field name="engine">aggregation</field>
|
||||||
|
<field name="formula">DEB.balance + CRE.balance</field>
|
||||||
|
<field name="auditable" eval="False"/>
|
||||||
|
</record>
|
||||||
|
</field>
|
||||||
|
</record>
|
||||||
|
<record id="account_financial_report_executivesummary_ca_to_l0" model="account.report.line">
|
||||||
|
<field name="name">Current assets to liabilities</field>
|
||||||
|
<field name="code">CATL</field>
|
||||||
|
<field name="expression_ids">
|
||||||
|
<record id="account_financial_report_executivesummary_ca_to_l0_balance" model="account.report.expression">
|
||||||
|
<field name="label">balance</field>
|
||||||
|
<field name="engine">aggregation</field>
|
||||||
|
<field name="formula">CATL.ca / CATL.cl</field>
|
||||||
|
<field name="subformula">ignore_zero_division</field>
|
||||||
|
<field name="figure_type">float</field>
|
||||||
|
<field name="auditable" eval="False"/>
|
||||||
|
</record>
|
||||||
|
<record id="account_financial_report_executivesummary_ca_to_l0_ca" model="account.report.expression">
|
||||||
|
<field name="label">ca</field>
|
||||||
|
<field name="engine">aggregation</field>
|
||||||
|
<field name="formula">CA.balance</field>
|
||||||
|
<field name="date_scope">from_beginning</field>
|
||||||
|
<field name="subformula">cross_report</field>
|
||||||
|
</record>
|
||||||
|
<record id="account_financial_report_executivesummary_ca_to_l0_cl" model="account.report.expression">
|
||||||
|
<field name="label">cl</field>
|
||||||
|
<field name="engine">aggregation</field>
|
||||||
|
<field name="formula">CL.balance</field>
|
||||||
|
<field name="date_scope">from_beginning</field>
|
||||||
|
<field name="subformula">cross_report</field>
|
||||||
|
</record>
|
||||||
|
</field>
|
||||||
|
</record>
|
||||||
|
</field>
|
||||||
|
</record>
|
||||||
|
</field>
|
||||||
|
</record>
|
||||||
|
</odoo>
|
||||||
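For readers auditing the expressions above: the aggregation formulas are the standard days-sales-outstanding, days-payable-outstanding, and current-ratio calculations. A minimal sketch of the arithmetic, with `ignore_zero_division` mirrored as a zero result (function names are illustrative, not from the module):

```python
def avg_debtor_days(receivable_balance, operating_income, period_days):
    """DEB.balance / opinc * NDays."""
    if operating_income == 0:
        return 0.0  # mirrors the report's ignore_zero_division subformula
    return receivable_balance / operating_income * period_days

def avg_creditor_days(payable_balance, cost_of_sales, expenses, period_days):
    """-CRE.balance / (cos + exp) * NDays; payables carry a credit (negative) balance."""
    denominator = cost_of_sales + expenses
    if denominator == 0:
        return 0.0
    return -payable_balance / denominator * period_days

def current_ratio(current_assets, current_liabilities):
    """CATL.ca / CATL.cl."""
    if current_liabilities == 0:
        return 0.0
    return current_assets / current_liabilities

# Example: 10,000 receivable against 60,000 revenue over a 30-day period
print(avg_debtor_days(10_000, 60_000, 30))  # 5.0
```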
117
Fusion Accounting/data/followup_data.xml
Normal file
@@ -0,0 +1,117 @@
<?xml version="1.0" encoding="utf-8"?>
<odoo noupdate="1">

    <!-- ============================================================ -->
    <!-- DEFAULT FOLLOW-UP LEVELS -->
    <!-- ============================================================ -->

    <record id="fusion_followup_level_1" model="fusion.followup.level">
        <field name="name">First Reminder</field>
        <field name="sequence">10</field>
        <field name="delay">15</field>
        <field name="send_email" eval="True"/>
        <field name="send_sms" eval="False"/>
        <field name="send_letter" eval="False"/>
        <field name="join_invoices" eval="True"/>
        <field name="description" type="html">
            <p>Dear Customer,</p>
            <p>We notice that your account has an outstanding balance past the due date.
            We kindly ask you to settle the amount at your earliest convenience.</p>
            <p>If payment has already been made, please disregard this notice.</p>
            <p>Best regards</p>
        </field>
        <field name="company_id" ref="base.main_company"/>
    </record>

    <record id="fusion_followup_level_2" model="fusion.followup.level">
        <field name="name">Second Reminder</field>
        <field name="sequence">20</field>
        <field name="delay">30</field>
        <field name="send_email" eval="True"/>
        <field name="send_sms" eval="False"/>
        <field name="send_letter" eval="True"/>
        <field name="join_invoices" eval="True"/>
        <field name="description" type="html">
            <p>Dear Customer,</p>
            <p>Despite our previous reminder, your account still has an overdue balance.
            We urge you to arrange payment promptly to avoid further action.</p>
            <p>Please contact us immediately if there is a dispute regarding the invoice.</p>
            <p>Best regards</p>
        </field>
        <field name="company_id" ref="base.main_company"/>
    </record>

    <record id="fusion_followup_level_3" model="fusion.followup.level">
        <field name="name">Final Notice</field>
        <field name="sequence">30</field>
        <field name="delay">45</field>
        <field name="send_email" eval="True"/>
        <field name="send_sms" eval="True"/>
        <field name="send_letter" eval="True"/>
        <field name="join_invoices" eval="True"/>
        <field name="description" type="html">
            <p>Dear Customer,</p>
            <p>This is our final notice regarding your overdue account balance.
            Immediate payment is required. Failure to remit payment may result
            in suspension of services and further collection measures.</p>
            <p>Please contact our accounting department without delay.</p>
            <p>Best regards</p>
        </field>
        <field name="company_id" ref="base.main_company"/>
    </record>

    <!-- ============================================================ -->
    <!-- DEFAULT EMAIL TEMPLATE -->
    <!-- ============================================================ -->

    <record id="email_template_fusion_followup_default" model="mail.template">
        <field name="name">Fusion: Payment Follow-up</field>
        <field name="model_id" ref="base.model_res_partner"/>
        <field name="email_from">{{ (object.company_id.email or user.email_formatted) }}</field>
        <field name="subject">{{ object.company_id.name }} - Payment Reminder for {{ object.name }}</field>
        <field name="body_html" type="html">
            <div style="margin: 0; padding: 0; font-family: Arial, sans-serif;">
                <p style="margin: 0 0 12px 0;">
                    Dear <t t-out="object.name or 'Customer'"/>,
                </p>
                <p style="margin: 0 0 12px 0;">
                    We are writing to remind you that your account with
                    <strong><t t-out="object.company_id.name or ''"/></strong>
                    has an outstanding balance that is past due.
                </p>
                <p style="margin: 0 0 12px 0;">
                    Please arrange payment at your earliest convenience. If you have
                    already sent payment, please disregard this notice and accept
                    our thanks.
                </p>
                <p style="margin: 0 0 12px 0;">
                    Should you have any questions or wish to discuss payment
                    arrangements, please do not hesitate to contact us.
                </p>
                <p style="margin: 0 0 4px 0;">Best regards,</p>
                <p style="margin: 0;">
                    <t t-out="user.name or ''"/><br/>
                    <t t-out="object.company_id.name or ''"/>
                </p>
            </div>
        </field>
        <field name="lang">{{ object.lang }}</field>
        <field name="auto_delete" eval="False"/>
    </record>

    <!-- ============================================================ -->
    <!-- SCHEDULED ACTION (ir.cron) -->
    <!-- ============================================================ -->

    <record id="cron_fusion_followup_check" model="ir.cron">
        <field name="name">Fusion: Payment Follow-up Check</field>
        <field name="model_id" ref="base.model_res_partner"/>
        <field name="state">code</field>
        <field name="code">model.compute_partners_needing_followup()</field>
        <field name="interval_number">1</field>
        <field name="interval_type">days</field>
        <field name="active" eval="True"/>
    </record>

</odoo>
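The three records above form an escalation ladder keyed on the `delay` field (15, 30, 45 days past due). A level applies once a partner's oldest unpaid invoice has been overdue at least that many days; the selection logic sketched here is illustrative only, since the module's `compute_partners_needing_followup` implementation is not shown in this file:

```python
# Delay thresholds taken from fusion_followup_level_1..3 above;
# the selection helper itself is a hypothetical illustration.
FOLLOWUP_LEVELS = [
    (15, "First Reminder"),
    (30, "Second Reminder"),
    (45, "Final Notice"),
]

def applicable_level(days_overdue):
    """Return the name of the highest level whose delay has elapsed, or None."""
    selected = None
    for delay, name in FOLLOWUP_LEVELS:
        if days_overdue >= delay:
            selected = name
    return selected

print(applicable_level(32))  # Second Reminder
```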
32
Fusion Accounting/data/fusion_accounting_data.xml
Normal file
@@ -0,0 +1,32 @@
<?xml version="1.0" encoding="utf-8"?>
<odoo>
    <!-- Switch root menu "Invoicing" to "Accounting" -->
    <!-- Top menu item -->
    <menuitem name="Accounting"
              id="menu_accounting"
              groups="account.group_account_readonly,account.group_account_invoice"
              web_icon="fusion_accounting,static/description/icon.png"
              sequence="60"/>
    <!-- Move existing submenus to point to the new parent -->
    <record id="account.menu_finance_receivables" model="ir.ui.menu">
        <field name="parent_id" ref="menu_accounting"/>
    </record>
    <record id="account.menu_finance_payables" model="ir.ui.menu">
        <field name="parent_id" ref="menu_accounting"/>
    </record>
    <record id="account.menu_finance_entries" model="ir.ui.menu">
        <field name="parent_id" ref="menu_accounting"/>
    </record>
    <record id="account.menu_finance_reports" model="ir.ui.menu">
        <field name="parent_id" ref="menu_accounting"/>
    </record>
    <record id="account.menu_finance_configuration" model="ir.ui.menu">
        <field name="parent_id" ref="menu_accounting"/>
    </record>
    <record id="account.menu_board_journal_1" model="ir.ui.menu">
        <field name="parent_id" ref="menu_accounting"/>
    </record>

    <menuitem id="account.menu_account_config" name="Settings" parent="account.menu_finance_configuration" sequence="0" groups="base.group_system"/>

</odoo>
11
Fusion Accounting/data/fusion_accounting_tour.xml
Normal file
@@ -0,0 +1,11 @@
<?xml version="1.0" encoding="utf-8"?>
<odoo>
    <record id="fusion_accounting_tour" model="web_tour.tour">
        <field name="name">fusion_accounting_tour</field>
        <field name="sequence">50</field>
        <field name="rainbow_man_message"><![CDATA[
            <strong><b>Good job!</b> You went through all steps of this tour.</strong>
            <br>See how to manage your customer invoices in the <b>Customers/Invoices</b> menu
        ]]></field>
    </record>
</odoo>
49
Fusion Accounting/data/general_ledger.xml
Normal file
@@ -0,0 +1,49 @@
<?xml version="1.0" encoding="utf-8"?>
<odoo>
    <record id="general_ledger_report" model="account.report">
        <field name="name">General Ledger</field>
        <field name="filter_journals" eval="True"/>
        <field name="filter_analytic" eval="True"/>
        <field name="filter_period_comparison" eval="False"/>
        <field name="filter_multi_company">selector</field>
        <field name="filter_unfold_all" eval="True"/>
        <field name="filter_hide_0_lines">never</field>
        <field name="default_opening_date_filter">this_month</field>
        <field name="search_bar" eval="True"/>
        <field name="load_more_limit" eval="80"/>
        <field name="custom_handler_model_id" ref="model_account_general_ledger_report_handler"/>
        <field name="column_ids">
            <record id="general_ledger_report_date" model="account.report.column">
                <field name="name">Date</field>
                <field name="expression_label">date</field>
                <field name="figure_type">date</field>
            </record>
            <record id="general_ledger_report_communication" model="account.report.column">
                <field name="name">Communication</field>
                <field name="expression_label">communication</field>
                <field name="figure_type">string</field>
            </record>
            <record id="general_ledger_report_partner_name" model="account.report.column">
                <field name="name">Partner</field>
                <field name="expression_label">partner_name</field>
                <field name="figure_type">string</field>
            </record>
            <record id="general_ledger_report_amount_currency" model="account.report.column">
                <field name="name">Currency</field>
                <field name="expression_label">amount_currency</field>
            </record>
            <record id="general_ledger_report_debit" model="account.report.column">
                <field name="name">Debit</field>
                <field name="expression_label">debit</field>
            </record>
            <record id="general_ledger_report_credit" model="account.report.column">
                <field name="name">Credit</field>
                <field name="expression_label">credit</field>
            </record>
            <record id="general_ledger_report_balance" model="account.report.column">
                <field name="name">Balance</field>
                <field name="expression_label">balance</field>
            </record>
        </field>
    </record>
</odoo>
14
Fusion Accounting/data/generic_tax_report.xml
Normal file
@@ -0,0 +1,14 @@
<?xml version="1.0" encoding="UTF-8" ?>
<odoo>
    <record id="account.generic_tax_report" model="account.report">
        <field name="custom_handler_model_id" ref="model_account_generic_tax_report_handler"/>
    </record>

    <record id="account.generic_tax_report_account_tax" model="account.report">
        <field name="custom_handler_model_id" ref="model_account_generic_tax_report_handler_account_tax"/>
    </record>

    <record id="account.generic_tax_report_tax_account" model="account.report">
        <field name="custom_handler_model_id" ref="model_account_generic_tax_report_handler_tax_account"/>
    </record>
</odoo>
11
Fusion Accounting/data/ir_cron.xml
Normal file
@@ -0,0 +1,11 @@
<?xml version="1.0" encoding="utf-8"?>
<odoo>
    <record id="auto_reconcile_bank_statement_line" model="ir.cron">
        <field name="name">Try to reconcile automatically your statement lines</field>
        <field name="model_id" ref="model_account_bank_statement_line"/>
        <field name="state">code</field>
        <field name="code">model._cron_try_auto_reconcile_statement_lines(batch_size=100)</field>
        <field name="interval_number">1</field>
        <field name="interval_type">days</field>
    </record>
</odoo>
235
Fusion Accounting/data/journal_report.xml
Normal file
@@ -0,0 +1,235 @@
<?xml version="1.0" encoding="utf-8"?>
<odoo>
    <record id="journal_report" model="account.report">
        <field name="name">Journal Report</field>
        <field name="filter_journals" eval="True"/>
        <field name="filter_show_draft" eval="True"/>
        <field name="filter_unfold_all" eval="True"/>
        <field name="filter_hierarchy">never</field>
        <field name="filter_period_comparison" eval="False"/>
        <field name="filter_unreconciled" eval="False"/>
        <field name="filter_hide_0_lines">never</field>
        <field name="default_opening_date_filter">this_year</field>
        <field name="custom_handler_model_id" ref="model_account_journal_report_handler"/>
        <field name="column_ids">
            <record id="journal_report_code" model="account.report.column">
                <field name="name">Code</field>
                <field name="expression_label">code</field>
                <field name="figure_type">string</field>
            </record>
            <record id="journal_report_debit" model="account.report.column">
                <field name="name">Debit</field>
                <field name="expression_label">debit</field>
            </record>
            <record id="journal_report_credit" model="account.report.column">
                <field name="name">Credit</field>
                <field name="expression_label">credit</field>
            </record>
            <record id="journal_report_balance" model="account.report.column">
                <field name="name">Balance</field>
                <field name="expression_label">balance</field>
            </record>
        </field>
        <field name="line_ids">
            <record id="journal_report_line" model="account.report.line">
                <field name="name">Name</field>
                <field name="groupby">journal_id, account_id</field>
                <field name="hierarchy_level">0</field>
                <field name="expression_ids">
                    <record id="journal_report_line_code" model="account.report.expression">
                        <field name="label">code</field>
                        <field name="engine">custom</field>
                        <field name="formula">_report_custom_engine_journal_report</field>
                        <field name="subformula">code</field>
                    </record>
                    <record id="journal_report_line_debit" model="account.report.expression">
                        <field name="label">debit</field>
                        <field name="engine">custom</field>
                        <field name="formula">_report_custom_engine_journal_report</field>
                        <field name="subformula">debit</field>
                    </record>
                    <record id="journal_report_line_credit" model="account.report.expression">
                        <field name="label">credit</field>
                        <field name="engine">custom</field>
                        <field name="formula">_report_custom_engine_journal_report</field>
                        <field name="subformula">credit</field>
                    </record>
                    <record id="journal_report_line_balance" model="account.report.expression">
                        <field name="label">balance</field>
                        <field name="engine">custom</field>
                        <field name="formula">_report_custom_engine_journal_report</field>
                        <field name="subformula">balance</field>
                    </record>
                </field>
            </record>
        </field>
    </record>

    <template id="journal_report_pdf_export_main">
        <html>
            <head>
                <base t-att-href="base_url"/>
                <meta http-equiv="content-type" content="text/html; charset=utf-8"/>
                <t t-call-assets="fusion_accounting.assets_pdf_export" t-js="False"/>
            </head>
            <body t-att-dir="env['res.lang']._get_data(code=lang or env.user.lang).direction or 'ltr'">
                <div t-att-class="options['css_custom_class']">
                    <header>
                        <div class="row align-items-center">
                            <div class="col-4 o_header_font">
                                <t t-call="fusion_accounting.company_information"/>
                            </div>
                            <div class="col-4">
                                <div class="o_title">
                                    <t t-if="report.filter_show_draft and options['all_entries']">[Draft]</t>
                                    <t t-out="report.name"/>
                                </div>
                                <div class="o_subtitle">
                                    <t t-out="options['date']['date_from']"/> - <t t-out="options['date']['date_to']"/>
                                </div>
                            </div>
                        </div>
                    </header>

                    <!-- Journal entries -->
                    <t t-foreach="document_data['journals_vals']" t-as="journal_vals">
                        <section style="page-break-after: always;">
                            <div class="o_section_title">
                                <t t-out="journal_vals.get('name')"/>
                            </div>
                            <div class="d-flex align-items-start">
                                <t t-call="fusion_accounting.journal_report_pdf_body_default"/>
                            </div>

                            <t t-if="journal_vals.get('tax_summary')">
                                <t t-call="fusion_accounting.pdf_journal_report_taxes_summary">
                                    <t t-set="tax_summary" t-value="journal_vals['tax_summary']"/>
                                </t>
                            </t>
                        </section>
                    </t>

                    <section t-if="document_data.get('global_tax_summary')">
                        <div class="o_section_title">
                            Global Tax Summary
                        </div>
                        <t t-call="fusion_accounting.pdf_journal_report_taxes_summary">
                            <t t-set="tax_summary" t-value="document_data['global_tax_summary']"/>
                        </t>
                    </section>
                </div>
            </body>
        </html>
    </template>

    <template id="journal_report_pdf_body_default">
        <table class="o_table">
            <thead>
                <tr>
                    <t t-foreach="journal_vals['columns']" t-as="column">
                        <th t-att-class="column.get('class', '')">
                            <t t-out="column.get('name', '')"/>
                        </th>
                    </t>
                </tr>
            </thead>

            <tbody>
                <t t-foreach="journal_vals['lines']" t-as="line">
                    <tr t-att-class="line.get('line_class', '')">
                        <t t-foreach="journal_vals['columns']" t-as="column">
                            <t t-if="line.get(column['label'])">
                                <t t-set="cell_style" t-value="line[column['label']].get('class', '')"/>
                                <t t-set="column_style" t-value="column.get('class', '')"/>
                                <td t-att-class="cell_style + column_style">
                                    <t t-out="line[column['label']]['data']"/>
                                </td>
                            </t>
                            <t t-else="">
                                <td/>
                            </t>
                        </t>
                    </tr>
                </t>
            </tbody>
        </table>
    </template>

    <template id="pdf_journal_report_taxes_summary">
        <div class="container tax_summary" style="page-break-inside: avoid;">
            <t t-set="taxes" t-value="tax_summary.get('tax_report_lines')"/>
            <t t-if="taxes">
                <div class="row o_section_subtitle">
                    <p>Tax Applied</p>
                </div>
                <div class="row taxes">
                    <t t-set="extra_columns" t-value="tax_summary.get('extra_columns')"/>
                    <table class="o_table">
                        <thead>
                            <tr>
                                <th t-if="len(taxes) > 1">Country</th>
                                <th>Name</th>
                                <th class="o_right_alignment">Base Amount</th>
                                <th class="o_right_alignment">Tax Amount</th>
                                <th t-if="tax_summary.get('tax_non_deductible_column')" class="o_right_alignment">Non-Deductible</th>
                                <th t-if="tax_summary.get('tax_deductible_column')" class="o_right_alignment">Deductible</th>
                                <th t-if="tax_summary.get('tax_due_column')" class="o_right_alignment">Due</th>
                            </tr>
                        </thead>
                        <tbody>
                            <t t-foreach="taxes" t-as="country_name">
                                <tr t-foreach="taxes[country_name]" t-as="tax">
                                    <t t-if="country_name_size > 1">
                                        <td>
                                            <t t-if="tax_index == 0" t-out="country_name"/>
                                        </td>
                                    </t>
                                    <td t-out="tax['name']"/>
                                    <td class="o_right_alignment" t-out="tax['base_amount']"/>
                                    <td class="o_right_alignment" t-out="tax['tax_amount']"/>
                                    <td t-if="tax_summary.get('tax_non_deductible_column')" class="o_right_alignment" t-out="tax['tax_non_deductible']"/>
                                    <td t-if="tax_summary.get('tax_deductible_column')" class="o_right_alignment" t-out="tax['tax_deductible']"/>
                                    <td t-if="tax_summary.get('tax_due_column')" class="o_right_alignment" t-out="tax['tax_due']"/>
                                </tr>
                            </t>
                        </tbody>
                    </table>
                </div>
            </t>
            <t t-set="grids" t-value="tax_summary.get('tax_grid_summary_lines')"/>
            <t t-if="grids">
                <div class="row o_section_subtitle">
                    <p>Impacted Tax Grids</p>
                </div>
                <div class="row tax_grid">
                    <table class="o_table">
                        <thead>
                            <tr>
                                <th t-if="len(grids) > 1">Country</th>
                                <th>Grid</th>
                                <th class="o_right_alignment">+</th>
                                <th class="o_right_alignment">-</th>
                                <th class="o_right_alignment">Impact On Grid</th>
                            </tr>
                        </thead>
                        <tbody>
                            <t t-foreach="grids" t-as="country_name">
                                <tr t-foreach="grids[country_name]" t-as="grid_name">
                                    <t t-if="country_name_size > 1">
                                        <td>
                                            <t t-if="grid_name_index == 0" t-out="country_name"/>
                                        </td>
                                    </t>
                                    <td t-out="grid_name"/>
                                    <td class="o_right_alignment" t-out="grids[country_name][grid_name].get('+', 0)"/>
                                    <td class="o_right_alignment" t-out="grids[country_name][grid_name].get('-', 0)"/>
                                    <td class="o_right_alignment" t-out="grids[country_name][grid_name]['impact']"/>
                                </tr>
                            </t>
                        </tbody>
                    </table>
                </div>
            </t>
        </div>
    </template>
</odoo>
29
Fusion Accounting/data/loan_data.xml
Normal file
@@ -0,0 +1,29 @@
<?xml version="1.0" encoding="utf-8"?>
<odoo noupdate="1">

    <!-- ============================================================ -->
    <!-- SEQUENCE: Loan Reference -->
    <!-- ============================================================ -->
    <record id="seq_fusion_loan" model="ir.sequence">
        <field name="name">Fusion Loan</field>
        <field name="code">fusion.loan</field>
        <field name="prefix">LOAN/%(year)s/</field>
        <field name="padding">4</field>
        <field name="company_id" eval="False"/>
    </record>

    <!-- ============================================================ -->
    <!-- SCHEDULED ACTION: Auto-generate Loan Entries -->
    <!-- ============================================================ -->
    <record id="ir_cron_generate_loan_entries" model="ir.cron">
        <field name="name">Generate Loan Installment Entries</field>
        <field name="model_id" ref="model_fusion_loan"/>
        <field name="state">code</field>
        <field name="code">model._cron_generate_loan_entries()</field>
        <field name="interval_number">1</field>
        <field name="interval_type">days</field>
        <field name="active">True</field>
    </record>

</odoo>
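The daily cron above posts loan installment entries via `_cron_generate_loan_entries()`, whose implementation is not part of this file. For context only, a standard annuity installment calculation such a scheduler would typically rely on (this sketch is hypothetical, not the module's code):

```python
def monthly_installment(principal, annual_rate, n_months):
    """Standard annuity payment for a fixed-rate loan.

    Illustrative only: the fusion.loan model's actual schedule logic
    is not shown in this data file.
    """
    if n_months <= 0:
        raise ValueError("n_months must be positive")
    r = annual_rate / 12.0          # monthly rate
    if r == 0:
        return principal / n_months  # interest-free: straight-line split
    return principal * r / (1 - (1 + r) ** -n_months)

print(round(monthly_installment(12_000, 0.0, 12), 2))  # 1000.0
```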
34
Fusion Accounting/data/mail_activity_type_data.xml
Normal file
@@ -0,0 +1,34 @@
<?xml version="1.0" encoding="utf-8"?>
<odoo>
    <data>
        <record id="tax_closing_activity_type" model="mail.activity.type">
            <field name="name">Tax Report</field>
            <field name="summary">Tax Report</field>
            <field name="category">tax_report</field>
            <field name="res_model">account.journal</field>
            <field name="chaining_type">suggest</field>
        </record>

        <record id="mail_activity_type_tax_report_to_pay" model="mail.activity.type">
            <field name="name">Pay Tax</field>
            <field name="summary">Tax is ready to be paid</field>
            <field name="category">tax_report</field>
            <field name="delay_count">0</field>
            <field name="delay_unit">days</field>
            <field name="delay_from">previous_activity</field>
            <field name="res_model">account.move</field>
            <field name="chaining_type">suggest</field>
        </record>

        <record id="mail_activity_type_tax_report_to_be_sent" model="mail.activity.type">
            <field name="name">Tax Report Ready</field>
            <field name="summary">Tax report is ready to be sent to the administration</field>
            <field name="category">tax_report</field>
            <field name="delay_count">0</field>
            <field name="delay_unit">days</field>
            <field name="delay_from">current_date</field>
            <field name="res_model">account.move</field>
            <field name="chaining_type">suggest</field>
        </record>
    </data>
</odoo>
26
Fusion Accounting/data/mail_templates.xml
Normal file
@@ -0,0 +1,26 @@
<odoo>
    <record id="email_template_customer_statement" model="mail.template">
        <field name="name">Customer Statement</field>
        <field name="model_id" ref="base.model_res_partner"/>
        <field name="email_from">{{ object._get_followup_responsible().email_formatted }}</field>
        <field name="subject">{{ (object.company_id or object._get_followup_responsible().company_id).name }} Statement - {{ object.commercial_company_name }}</field>
        <field name="body_html" type="html">
            <div style="margin: 0px; padding: 0px;">
                <p style="margin: 0px; padding: 0px;">
                    <t t-if="object.id != object.commercial_partner_id.id">Dear <t t-out="object.name or ''"/> (<t t-out="object.commercial_partner_id.name or ''"/>),</t>
                    <t t-else="">Dear <t t-out="object.name or ''"/>,</t>
                    <br/>
                    Please find enclosed the statement of your account.
                    <br/>
                    Do not hesitate to contact us if you have any questions.
                    <br/>
                    Sincerely,
                    <br/>
                    <t t-out="object._get_followup_responsible().name if is_html_empty(object._get_followup_responsible().signature) else object._get_followup_responsible().signature"/>
                </p>
            </div>
        </field>
        <field name="lang">{{ object.lang }}</field>
        <field name="auto_delete" eval="False"/>
    </record>
</odoo>
53
Fusion Accounting/data/menuitems.xml
Normal file
@@ -0,0 +1,53 @@
<?xml version="1.0" encoding="utf-8"?>
<odoo>
    <menuitem id="menu_action_account_report_partner_ledger" name="Partner Ledger"
              action="action_account_report_partner_ledger" groups="account.group_account_readonly"
              parent="account.account_reports_partners_reports_menu"/>
    <menuitem id="menu_action_account_report_aged_receivable" name="Aged Receivable" action="action_account_report_ar"
              groups="account.group_account_readonly" parent="account.account_reports_partners_reports_menu"/>
    <menuitem id="menu_action_account_report_aged_payable" name="Aged Payable" action="action_account_report_ap"
              groups="account.group_account_readonly" parent="account.account_reports_partners_reports_menu"/>
    <menuitem id="account_reports_audit_reports_menu" name="Audit Reports" parent="account.menu_finance_reports"
              sequence="2">
        <menuitem id="menu_action_account_report_general_ledger" name="General Ledger"
                  action="action_account_report_general_ledger" groups="account.group_account_readonly"/>
        <menuitem id="menu_action_account_report_coa" name="Trial Balance" action="action_account_report_coa"
                  groups="account.group_account_readonly"/>
        <menuitem id="menu_action_account_report_ja" name="Journal Audit" action="action_account_report_ja"
                  groups="account.group_account_readonly"/>
    </menuitem>

    <menuitem id="menu_action_account_report_gt" name="Tax Return" action="action_account_report_gt"
              parent="account.account_reports_legal_statements_menu" sequence="50"
              groups="account.group_account_readonly,account.group_account_basic"/>
    <menuitem id="menu_action_account_report_sales" action="action_account_report_sales"
              parent="account.account_reports_legal_statements_menu" sequence="60"
              groups="account.group_account_readonly" active="False"/>

    <menuitem id="menu_action_account_report_multicurrency_revaluation" name="Unrealized Currency Gains/Losses"
|
||||||
|
action="action_account_report_multicurrency_revaluation" parent="account.account_reports_management_menu"
|
||||||
|
groups="base.group_multi_currency"/>
|
||||||
|
<menuitem id="menu_action_account_report_balance_sheet" name="Balance Sheet" action="action_account_report_bs"
|
||||||
|
parent="account.account_reports_legal_statements_menu" groups="account.group_account_readonly"/>
|
||||||
|
<menuitem id="menu_action_account_report_profit_and_loss" name="Profit and Loss" action="action_account_report_pl"
|
||||||
|
parent="account.account_reports_legal_statements_menu" groups="account.group_account_readonly"/>
|
||||||
|
<menuitem id="menu_action_account_report_cash_flow" name="Cash Flow Statement" action="action_account_report_cs"
|
||||||
|
parent="account.account_reports_legal_statements_menu" groups="account.group_account_readonly"/>
|
||||||
|
<menuitem id="menu_action_account_report_exec_summary" name="Executive Summary"
|
||||||
|
action="action_account_report_exec_summary" parent="account.account_reports_legal_statements_menu"
|
||||||
|
groups="account.group_account_readonly"/>
|
||||||
|
<menuitem id="menu_action_account_report_deferred_expense" name="Deferred Expense"
|
||||||
|
action="action_account_report_deferred_expense" parent="account.account_reports_management_menu"
|
||||||
|
groups="account.group_account_readonly"/>
|
||||||
|
<menuitem id="menu_action_account_report_deferred_revenue" name="Deferred Revenue"
|
||||||
|
action="action_account_report_deferred_revenue" parent="account.account_reports_management_menu"
|
||||||
|
groups="account.group_account_readonly"/>
|
||||||
|
<menuitem id="menu_action_account_report_tree" name="Accounting Reports" sequence="6"
|
||||||
|
parent="account.account_account_menu" action="action_account_report_tree" groups="base.group_no_one"/>
|
||||||
|
<menuitem id="menu_action_account_report_horizontal_groups" name="Horizontal Groups"
|
||||||
|
action="action_account_report_horizontal_groups" parent="account.account_account_menu" sequence="10"
|
||||||
|
groups="base.group_no_one"/>
|
||||||
|
<menuitem id="menu_action_account_report_budget_tree" name="Financial Budgets"
|
||||||
|
action="action_account_report_budget_tree" parent="account.account_account_menu" sequence="11"/>
|
||||||
|
|
||||||
|
</odoo>
|
||||||
8
Fusion Accounting/data/menuitems_asset.xml
Normal file
@@ -0,0 +1,8 @@
<?xml version="1.0" encoding="utf-8"?>
<odoo>
    <menuitem id="menu_action_account_report_assets"
              name="Depreciation Schedule"
              action="action_account_report_assets"
              parent="account.account_reports_management_menu"
              groups="account.group_account_readonly"/>
</odoo>
107
Fusion Accounting/data/multicurrency_revaluation_report.xml
Normal file
@@ -0,0 +1,107 @@
<?xml version="1.0" encoding="utf-8"?>
<odoo>
    <record id="multicurrency_revaluation_report" model="account.report">
        <field name="name">Unrealized Currency Gains/Losses</field>
        <field name="filter_date_range" eval="False"/>
        <field name="filter_show_draft" eval="True"/>
        <field name="default_opening_date_filter">previous_month</field>
        <field name="custom_handler_model_id" ref="model_account_multicurrency_revaluation_report_handler"/>
        <field name="column_ids">
            <record id="multicurrency_revaluation_report_balance_currency" model="account.report.column">
                <field name="name">Balance in Foreign Currency</field>
                <field name="expression_label">balance_currency</field>
            </record>
            <record id="multicurrency_revaluation_report_balance_operation" model="account.report.column">
                <field name="name">Balance at Operation Rate</field>
                <field name="expression_label">balance_operation</field>
            </record>
            <record id="multicurrency_revaluation_report_balance_current" model="account.report.column">
                <field name="name">Balance at Current Rate</field>
                <field name="expression_label">balance_current</field>
            </record>
            <record id="multicurrency_revaluation_report_adjustment" model="account.report.column">
                <field name="name">Adjustment</field>
                <field name="expression_label">adjustment</field>
            </record>
        </field>
        <field name="line_ids">
            <record id="multicurrency_revaluation_to_adjust" model="account.report.line">
                <field name="name">Accounts To Adjust</field>
                <field name="code">multicurrency_included</field>
                <field name="groupby">currency_id, account_id, id</field>
                <field name="expression_ids">
                    <record id="multicurrency_revaluation_to_adjust_balance_currency" model="account.report.expression">
                        <field name="label">balance_currency</field>
                        <field name="engine">custom</field>
                        <field name="formula">_report_custom_engine_multi_currency_revaluation_to_adjust</field>
                        <field name="subformula">balance_currency</field>
                    </record>
                    <record id="multicurrency_revaluation_to_adjust_balance_currency_forced_currency" model="account.report.expression">
                        <field name="label">_currency_balance_currency</field>
                        <field name="engine">custom</field>
                        <field name="formula">_report_custom_engine_multi_currency_revaluation_to_adjust</field>
                        <field name="subformula">currency_id</field>
                    </record>
                    <record id="multicurrency_revaluation_to_adjust_balance_operation" model="account.report.expression">
                        <field name="label">balance_operation</field>
                        <field name="engine">custom</field>
                        <field name="formula">_report_custom_engine_multi_currency_revaluation_to_adjust</field>
                        <field name="subformula">balance_operation</field>
                        <field name="auditable" eval="False"/>
                    </record>
                    <record id="multicurrency_revaluation_to_adjust_balance_current" model="account.report.expression">
                        <field name="label">balance_current</field>
                        <field name="engine">custom</field>
                        <field name="formula">_report_custom_engine_multi_currency_revaluation_to_adjust</field>
                        <field name="subformula">balance_current</field>
                        <field name="auditable" eval="False"/>
                    </record>
                    <record id="multicurrency_revaluation_to_adjust_adjustment" model="account.report.expression">
                        <field name="label">adjustment</field>
                        <field name="engine">custom</field>
                        <field name="formula">_report_custom_engine_multi_currency_revaluation_to_adjust</field>
                        <field name="subformula">adjustment</field>
                        <field name="auditable" eval="False"/>
                    </record>
                </field>
            </record>

            <record id="multicurrency_revaluation_excluded" model="account.report.line">
                <field name="name">Excluded Accounts</field>
                <field name="groupby">currency_id, account_id, id</field>
                <field name="expression_ids">
                    <record id="multicurrency_revaluation_excluded_balance_currency" model="account.report.expression">
                        <field name="label">balance_currency</field>
                        <field name="engine">custom</field>
                        <field name="formula">_report_custom_engine_multi_currency_revaluation_excluded</field>
                        <field name="subformula">balance_currency</field>
                    </record>
                    <record id="multicurrency_revaluation_excluded_balance_currency_forced_currency" model="account.report.expression">
                        <field name="label">_currency_balance_currency</field>
                        <field name="engine">custom</field>
                        <field name="formula">_report_custom_engine_multi_currency_revaluation_excluded</field>
                        <field name="subformula">currency_id</field>
                    </record>
                    <record id="multicurrency_revaluation_excluded_balance_operation" model="account.report.expression">
                        <field name="label">balance_operation</field>
                        <field name="engine">custom</field>
                        <field name="formula">_report_custom_engine_multi_currency_revaluation_excluded</field>
                        <field name="subformula">balance_operation</field>
                    </record>
                    <record id="multicurrency_revaluation_excluded_balance_current" model="account.report.expression">
                        <field name="label">balance_current</field>
                        <field name="engine">custom</field>
                        <field name="formula">_report_custom_engine_multi_currency_revaluation_excluded</field>
                        <field name="subformula">balance_current</field>
                    </record>
                    <record id="multicurrency_revaluation_excluded_adjustment" model="account.report.expression">
                        <field name="label">adjustment</field>
                        <field name="engine">custom</field>
                        <field name="formula">_report_custom_engine_multi_currency_revaluation_excluded</field>
                        <field name="subformula">adjustment</field>
                    </record>
                </field>
            </record>
        </field>
    </record>
</odoo>
65
Fusion Accounting/data/partner_ledger.xml
Normal file
@@ -0,0 +1,65 @@
<?xml version="1.0" encoding="utf-8"?>
<odoo>
    <record id="partner_ledger_report" model="account.report">
        <field name="name">Partner Ledger</field>
        <field name="filter_show_draft" eval="True"/>
        <field name="filter_account_type">both</field>
        <field name="filter_partner" eval="True"/>
        <field name="filter_unfold_all" eval="True"/>
        <field name="filter_unreconciled" eval="True"/>
        <field name="filter_period_comparison" eval="False"/>
        <field name="filter_multi_company">selector</field>
        <field name="filter_hide_0_lines">never</field>
        <field name="default_opening_date_filter">this_year</field>
        <field name="search_bar" eval="True"/>
        <field name="load_more_limit" eval="80"/>
        <field name="custom_handler_model_id" ref="model_account_partner_ledger_report_handler"/>
        <field name="column_ids">
            <record id="partner_ledger_report_journal_code" model="account.report.column">
                <field name="name">Journal</field>
                <field name="expression_label">journal_code</field>
                <field name="figure_type">string</field>
            </record>
            <record id="partner_ledger_report_account_code" model="account.report.column">
                <field name="name">Account</field>
                <field name="expression_label">account_code</field>
                <field name="figure_type">string</field>
            </record>
            <record id="partner_ledger_report_invoicing_date" model="account.report.column">
                <field name="name">Invoice Date</field>
                <field name="expression_label">invoice_date</field>
                <field name="figure_type">date</field>
            </record>
            <record id="partner_ledger_report_date_maturity" model="account.report.column">
                <field name="name">Due Date</field>
                <field name="expression_label">date_maturity</field>
                <field name="figure_type">date</field>
            </record>
            <record id="partner_ledger_report_matching_number" model="account.report.column">
                <field name="name">Matching</field>
                <field name="expression_label">matching_number</field>
                <field name="figure_type">string</field>
            </record>
            <record id="partner_ledger_report_debit" model="account.report.column">
                <field name="name">Debit</field>
                <field name="expression_label">debit</field>
            </record>
            <record id="partner_ledger_report_credit" model="account.report.column">
                <field name="name">Credit</field>
                <field name="expression_label">credit</field>
            </record>
            <record id="partner_ledger_amount" model="account.report.column">
                <field name="name">Amount</field>
                <field name="expression_label">amount</field>
            </record>
            <record id="partner_ledger_report_amount_currency" model="account.report.column">
                <field name="name">Amount Currency</field>
                <field name="expression_label">amount_currency</field>
            </record>
            <record id="partner_ledger_report_balance" model="account.report.column">
                <field name="name">Balance</field>
                <field name="expression_label">balance</field>
            </record>
        </field>
    </record>
</odoo>
335
Fusion Accounting/data/pdf_export_templates.xml
Normal file
@@ -0,0 +1,335 @@
<?xml version="1.0" encoding="utf-8"?>
<odoo>
    <template id="pdf_export_main">
        <html>
            <head>
                <base t-att-href="base_url"/>
                <meta http-equiv="content-type" content="text/html; charset=utf-8"/>
                <t t-call-assets="fusion_accounting.assets_pdf_export" t-js="False"/>
            </head>
            <body t-att-dir="env['res.lang']._get_data(code=lang or env.user.lang).direction or 'ltr'">
                <div t-att-class="'o_content ' + options['css_custom_class']">
                    <header>
                        <div class="o_title">
                            <t t-if="report.filter_show_draft and options['all_entries']">[Draft]</t>
                            <t t-out="report_title"/>
                        </div>
                        <div class="row o_header_font">
                            <div class="col-8">
                                <!-- All company information (name, address, vat, ...) -->
                                <t t-call="{{custom_templates.get('company_information', 'fusion_accounting.company_information')}}"/>
                            </div>
                            <div class="col-4">
                                <!-- All filters and options -->
                                <t t-call="{{custom_templates.get('pdf_export_filters', 'fusion_accounting.pdf_export_filters')}}"/>
                            </div>
                        </div>
                    </header>

                    <div class="d-flex align-items-start">
                        <t t-foreach="options.get('horizontal_split') and ['left', 'right'] or [None]" t-as="split_side">
                            <table t-attf-class="o_table #{options.get('horizontal_split') and 'horizontal_split_page'}">
                                <!-- Header -->
                                <t t-call="{{custom_templates.get('pdf_export_main_table_header', 'fusion_accounting.pdf_export_main_table_header')}}"/>

                                <!-- Body -->
                                <tbody>
                                    <t t-if="lines">
                                        <t t-call="{{custom_templates.get('pdf_export_main_table_body', 'fusion_accounting.pdf_export_main_table_body')}}">
                                            <t t-set="lines" t-value="filter(lambda x: not split_side or split_side == x.get('horizontal_split_side', 'left'), lines)"/>
                                        </t>
                                    </t>
                                </tbody>
                            </table>
                        </t>
                    </div>

                    <!-- Annotations -->
                    <ol class="o_annotation">
                        <t t-foreach="annotations" t-as="annotation">
                            <li>
                                <t t-out="annotation.get('number')"/>.
                                <t t-if="annotation.get('date')"><t t-out="annotation['date']"/> -</t>
                                <t t-out="annotation.get('text')"/>
                            </li>
                        </t>
                    </ol>
                </div>
            </body>
        </html>
    </template>

    <template id="company_information">
        <t t-set="company_names" t-value="[company['name'] for company in options['companies']]"/>
        <div class="row">
            <div class="col-10" t-out="', '.join(company_names)"/>
        </div>

        <address class="mb-0 o_text_muted" t-field="env.company.partner_id" t-options='{"widget": "contact", "fields": ["address"], "no_marker": True}'/>

        <t t-if="options.get('tax_unit', 'company_only') == 'company_only'">
            <t t-if="env.company.account_fiscal_country_id.vat_label" t-out="env.company.account_fiscal_country_id.vat_label+':'"/>
            <t t-else="">Tax ID:</t>
            <t t-out="env.company.vat"/>
        </t>
        <t t-else="">
            Tax ID: <t t-out="env['account.tax.unit'].browse(options.get('tax_unit')).vat"/>
        </t>
    </template>

    <template id="pdf_export_filters">
        <!-- Journals -->
        <t t-if="options.get('journals')">
            <div class="row" name="filter_info_template_journals">
                <t t-set="journal_group_selected" t-value="options.get('selected_journal_groups')"/>
                <t t-if="journal_group_selected">
                    <div class="col-3">Multi-Ledger: </div>
                    <div class="col-9 o_text_muted" t-out="journal_group_selected['title']"/>
                </t>
                <t t-else="">
                    <t t-set="journal_value" t-value="[journal.get('title') for journal in options['journals'] if journal.get('selected')]"/>
                    <t t-if="journal_value">
                        <div class="col-3">Journals: </div>
                        <div class="col-9 o_text_muted" t-out="', '.join(journal_value)"/>
                    </t>
                </t>
            </div>
        </t>

        <!-- Partners -->
        <t t-if="options.get('partner_ids') != None">
            <div class="row">
                <t t-set="partner_value" t-value="[partner for partner in options['selected_partner_ids']]"/>
                <t t-if="partner_value">
                    <div class="col-3">Partners:</div>
                    <div class="col-9 o_text_muted" t-out="', '.join(partner_value)"/>
                </t>
            </div>
        </t>

        <!-- Partners categories -->
        <t t-if="options.get('partner_categories') != None">
            <div class="row">
                <t t-set="partner_category_value" t-value="[partner for partner in options['selected_partner_categories']]"/>
                <t t-if="partner_category_value">
                    <div class="col-3">Partners Categories:</div>
                    <div class="col-9 o_text_muted" t-out="', '.join(partner_category_value)"/>
                </t>
            </div>
        </t>

        <!-- Horizontal -->
        <t t-if="options.get('selected_horizontal_group_id')">
            <div class="row">
                <t t-set="horizontal_group" t-value="[hg['name'] for hg in options['available_horizontal_groups'] if hg['id'] == options.get('selected_horizontal_group_id')]"/>
                <t t-if="horizontal_group">
                    <div class="col-3">Horizontal:</div>
                    <div class="col-9 o_text_muted" t-out="horizontal_group[0]"/>
                </t>
            </div>
        </t>

        <!-- Currency -->
        <t t-if="options.get('company_currency')">
            <div class="row">
                <div class="col-3">Currency:</div>
                <div class="col-9 o_text_muted" t-out="options['company_currency']['currency_name']"/>
            </div>
        </t>

        <!-- Filters -->
        <t t-if="options.get('aml_ir_filters') and any(opt['selected'] for opt in options['aml_ir_filters'])" name="aml_ir_filters">
            <div class="row">
                <!-- Note: the purchased module ships this comprehension without the enclosing
                     list brackets, which is a syntax error at render time; restored here. -->
                <t t-set="aml_ir_filters" t-value="[opt['name'] for opt in options['aml_ir_filters'] if opt['selected']]"/>
                <t t-if="aml_ir_filters">
                    <div class="col-3">Filters:</div>
                    <div class="col-9 o_text_muted" t-out="', '.join(aml_ir_filters)"/>
                </t>
            </div>
        </t>

        <!-- Extra options -->
        <div class="row" name="pdf_options_header">
            <t t-call="{{custom_templates.get('pdf_export_filter_extra_options_template', 'fusion_accounting.pdf_export_filter_extra_options_template')}}"/>
        </div>
    </template>

    <template id="pdf_export_filter_extra_options_template">
        <t t-set="rounding_unit_display_names" t-value="{k: v[1] for k, v in options['rounding_unit_names'].items() if v[1]}"/>
        <div class="col-3" t-if="(report.filter_show_draft and options['all_entries']) or
                                 (report.filter_unreconciled and options['unreconciled']) or
                                 options.get('include_analytic_without_aml') or
                                 options['rounding_unit'] in rounding_unit_display_names">
            Options:
        </div>
        <div class="col-9 o_text_muted">
            <t t-set="extra_options" t-value="[]"/>

            <!-- All entries -->
            <t t-if="report.filter_show_draft and options['all_entries']" groups="account.group_account_readonly">
                <t t-set="label_draft_entries">With Draft Entries</t>
                <t t-set="extra_options" t-value="extra_options + [label_draft_entries]"/>
            </t>

            <!-- Unreconciled -->
            <t t-if="report.filter_unreconciled and options['unreconciled']">
                <t t-set="label_unreconciled_entries">Unreconciled Entries</t>
                <t t-set="extra_options" t-value="extra_options + [label_unreconciled_entries]"/>
            </t>

            <!-- Analytic -->
            <t t-if="options.get('include_analytic_without_aml')" name="include_analytic">
                <t t-set="label_analytic_simulations">Including Analytic Simulations</t>
                <t t-set="extra_options" t-value="extra_options + [label_analytic_simulations]"/>
            </t>

            <!-- Currency Unit Amount Text -->
            <t t-if="options['rounding_unit'] in rounding_unit_display_names">
                <t t-set="rounding_unit" t-value="options.get('rounding_unit')"/>
                <t t-set="extra_options" t-value="extra_options + [rounding_unit_display_names[rounding_unit]]"/>
            </t>

            <t t-out="', '.join(extra_options)"/>
        </div>
    </template>

    <template id="pdf_export_main_table_header">
        <thead id="table_header">
            <t t-foreach="options['column_headers']" t-as="column_header">
                <tr>
                    <!-- First empty column -->
                    <th/>

                    <!-- Other columns -->
                    <t t-foreach="column_header * column_headers_render_data['level_repetitions'][column_header_index]" t-as="header">
                        <th t-att-colspan="header.get('colspan', column_headers_render_data['level_colspan'][column_header_index]) + (1 if options.get('show_horizontal_group_total') and column_header_first else 0)" class="o_overflow_name">
                            <t t-out="header.get('name')"/>
                        </th>
                    </t>

                    <th t-if="options.get('show_horizontal_group_total') and not column_header_first">
                        <t t-out="[group['name'] for group in options['available_horizontal_groups'] if group['id'] == options['selected_horizontal_group_id']][0]"/>
                    </th>

                    <th t-if="options.get('column_percent_comparison') == 'growth'">%</th>
                </tr>
            </t>
            <!-- Custom subheaders -->
            <t t-if="column_headers_render_data['custom_subheaders']">
                <tr>
                    <!-- First empty column -->
                    <th/>

                    <!-- Other columns -->
                    <t t-foreach="column_headers_render_data['custom_subheaders']" t-as="subheader">
                        <th t-att-colspan="subheader.get('colspan', 1)">
                            <t t-out="subheader.get('name')"/>
                        </th>
                    </t>
                </tr>
            </t>
            <tr>
                <!-- First empty column -->
                <th/>

                <t t-foreach="options['columns']" t-as="subheader">
                    <th>
                        <t t-out="subheader.get('name')"/>
                    </th>
                </t>
                <th t-if="options.get('show_horizontal_group_total')">
                    <t t-out="options['columns'][0].get('name')"/>
                </th>
                <th t-if="options.get('column_percent_comparison') == 'growth'"/>
            </tr>
        </thead>
    </template>

    <template id="pdf_export_main_table_body">
        <t t-foreach="lines" t-as="line">
            <t t-set="o_line_level" t-value="'o_line_level_' + str(line['level'])"/>

            <t t-if="line.get('page_break') and not options.get('horizontal_split')">
                <!-- End current table -->
                <t t-out="table_end"/>

                <!-- Append table header -->
                <t t-call="{{custom_templates.get('pdf_export_main_table_header', 'fusion_accounting.pdf_export_main_table_header')}}"/>

                <!-- Start new table -->
                <t t-out="table_start"/>
            </t>

            <!-- Adds an empty row above line with level 0 to add some spacing (it is the easiest and cleanest way) -->
            <t t-if="line_index != 0 and line['level'] == 0">
                <tr>
                    <td/>

                    <t t-foreach="line.get('columns')" t-as="cell">
                        <td/>
                    </t>

                    <t t-if="options.get('column_percent_comparison')">
                        <td/>
                    </t>

                    <t t-if="options.get('show_horizontal_group_total')">
                        <td/>
                    </t>
                </tr>
            </t>

            <t t-set="o_bold" t-value="(' o_fw_bold' if line.get('unfolded') or 'total' in line.get('id') else '')"/>
            <t t-set="o_overflow" t-value="(' o_overflow_name' if len(line.get('name') or '') > 42 else '')"/>

            <tr t-att-class="o_line_level + o_bold + o_overflow" name="pdf_export_main_table_body_lines_tr">
                <td t-att-colspan="line.get('colspan', '1')" class="o_line_name_level">
                    <t t-out="line.get('name')"/>
                    <t t-if="line.get('annotations')">
                        <t t-foreach="annotations" t-as="annotation">
                            <t t-if="annotation.get('number') and annotation['number'] in (line.get('annotations') or [])">
                                <sup t-out="annotation['number']"/>
                            </t>
                        </t>
                    </t>
                </td>

                <t t-foreach="line.get('columns')" t-as="cell">
                    <td class="o_cell_td">
                        <t t-if="not env.company.totals_below_sections or options.get('ignore_totals_below_sections') or not line.get('unfolded')">
                            <t t-call="{{custom_templates.get('pdf_export_cell', 'fusion_accounting.pdf_export_cell')}}"/>
                        </t>
                    </td>
                </t>

                <t t-if="options.get('column_percent_comparison')">
                    <td class="o_column_percent_comparison">
                        <t t-if="line.get('column_percent_comparison_data')">
                            <t t-out="line['column_percent_comparison_data'].get('name')"/>
                        </t>
                    </td>
                </t>

                <t t-if="options.get('show_horizontal_group_total')">
                    <td class="o_cell_td">
                        <t t-if="line.get('horizontal_group_total_data')">
                            <t t-set="o_classes" t-value="'o_line_cell_value_number' + (' o_muted' if line['horizontal_group_total_data'].get('no_format') == 0 else '')"/>
                            <span t-att-class="o_classes" t-out="line['horizontal_group_total_data'].get('name')"/>
                        </t>
                    </td>
                </t>
            </tr>
        </t>
    </template>

    <template id="pdf_export_cell">
        <t t-if="cell.get('figure_type', '') in ['float', 'integer', 'monetary', 'percentage']">
            <t t-set="o_classes" t-value="'o_line_cell_value_number' + (' o_muted' if cell.get('is_zero') else '')"/>
        </t>
        <t t-else="">
            <t t-set="o_classes" t-value="'o_overflow_value'"/>
        </t>

        <span t-att-class="o_classes" t-out="cell.get('name')"/>
    </template>
</odoo>
134
Fusion Accounting/data/profit_and_loss.xml
Normal file
@@ -0,0 +1,134 @@
<?xml version="1.0" encoding="utf-8"?>
<odoo>
    <record id="profit_and_loss" model="account.report">
        <field name="name">Profit and Loss</field>
        <field name="filter_analytic_groupby" eval="True"/>
        <field name="filter_unfold_all" eval="True"/>
        <field name="filter_journals" eval="True"/>
        <field name="filter_multi_company">selector</field>
        <field name="filter_budgets" eval="True"/>
        <field name="default_opening_date_filter">this_year</field>
        <field name="column_ids">
            <record id="profit_and_loss_column" model="account.report.column">
                <field name="name">Balance</field>
                <field name="expression_label">balance</field>
            </record>
        </field>
        <field name="line_ids">
            <record id="account_financial_report_revenue0" model="account.report.line">
                <field name="name">Revenue</field>
                <field name="code">REV</field>
                <field name="hierarchy_level">1</field>
                <field name="groupby">account_id</field>
                <field name="foldable" eval="True"/>
                <field name="expression_ids">
                    <record id="account_financial_report_revenue0_balance" model="account.report.expression">
                        <field name="label">balance</field>
                        <field name="engine">domain</field>
                        <field name="formula" eval="[('account_id.account_type', '=', 'income')]"/>
                        <field name="subformula">-sum</field>
                    </record>
                </field>
            </record>
            <record id="account_financial_report_cost_sales0" model="account.report.line">
                <field name="name">Less Costs of Revenue</field>
                <field name="code">COS</field>
                <field name="hierarchy_level">1</field>
                <field name="groupby">account_id</field>
                <field name="foldable" eval="True"/>
                <field name="expression_ids">
                    <record id="account_financial_report_cost_sales0_balance" model="account.report.expression">
                        <field name="label">balance</field>
                        <field name="engine">domain</field>
                        <field name="formula" eval="[('account_id.account_type', '=', 'expense_direct_cost')]"/>
                        <field name="subformula">sum</field>
                        <field name="green_on_positive" eval="False"/>
                    </record>
                </field>
            </record>
            <record id="account_financial_report_gross_profit0" model="account.report.line">
                <field name="name">Gross Profit</field>
                <field name="code">GRP</field>
                <field name="hierarchy_level">0</field>
                <field name="expression_ids">
                    <record id="account_financial_report_gross_profit0_balance" model="account.report.expression">
                        <field name="label">balance</field>
                        <field name="engine">aggregation</field>
                        <field name="formula">REV.balance - COS.balance</field>
                    </record>
                </field>
            </record>
            <record id="account_financial_report_expense0" model="account.report.line">
                <field name="name">Less Operating Expenses</field>
                <field name="code">EXP</field>
                <field name="hierarchy_level">1</field>
                <field name="groupby">account_id</field>
                <field name="foldable" eval="True"/>
                <field name="expression_ids">
                    <record id="account_financial_report_expense0_balance" model="account.report.expression">
                        <field name="label">balance</field>
                        <field name="engine">domain</field>
                        <field name="formula" eval="[('account_id.account_type', '=', 'expense')]"/>
|
||||||
|
<field name="subformula">sum</field>
|
||||||
|
<field name="green_on_positive" eval="False"/>
|
||||||
|
</record>
|
||||||
|
</field>
|
||||||
|
</record>
|
||||||
|
<record id="account_financial_report_operating_income0" model="account.report.line">
|
||||||
|
<field name="name">Operating Income (or Loss)</field>
|
||||||
|
<field name="hierarchy_level">0</field>
|
||||||
|
<field name="code">INC</field>
|
||||||
|
<field name="expression_ids">
|
||||||
|
<record id="account_financial_report_operating_income0_balance" model="account.report.expression">
|
||||||
|
<field name="label">balance</field>
|
||||||
|
<field name="engine">aggregation</field>
|
||||||
|
<field name="formula">REV.balance - COS.balance - EXP.balance</field>
|
||||||
|
</record>
|
||||||
|
</field>
|
||||||
|
</record>
|
||||||
|
<record id="account_financial_report_other_income0" model="account.report.line">
|
||||||
|
<field name="name">Plus Other Income</field>
|
||||||
|
<field name="code">OIN</field>
|
||||||
|
<field name="hierarchy_level">1</field>
|
||||||
|
<field name="groupby">account_id</field>
|
||||||
|
<field name="foldable" eval="True"/>
|
||||||
|
<field name="expression_ids">
|
||||||
|
<record id="account_financial_report_other_income0_balance" model="account.report.expression">
|
||||||
|
<field name="label">balance</field>
|
||||||
|
<field name="engine">domain</field>
|
||||||
|
<field name="formula" eval="[('account_id.account_type', '=', 'income_other')]"/>
|
||||||
|
<field name="subformula">-sum</field>
|
||||||
|
</record>
|
||||||
|
</field>
|
||||||
|
</record>
|
||||||
|
<record id="account_financial_report_depreciation0" model="account.report.line">
|
||||||
|
<field name="name">Less Other Expenses</field>
|
||||||
|
<field name="code">OEXP</field>
|
||||||
|
<field name="hierarchy_level">1</field>
|
||||||
|
<field name="groupby">account_id</field>
|
||||||
|
<field name="foldable" eval="True"/>
|
||||||
|
<field name="expression_ids">
|
||||||
|
<record id="account_financial_report_depreciation0_balance" model="account.report.expression">
|
||||||
|
<field name="label">balance</field>
|
||||||
|
<field name="engine">domain</field>
|
||||||
|
<field name="formula" eval="[('account_id.account_type', '=', 'expense_depreciation')]"/>
|
||||||
|
<field name="subformula">sum</field>
|
||||||
|
<field name="green_on_positive" eval="False"/>
|
||||||
|
</record>
|
||||||
|
</field>
|
||||||
|
</record>
|
||||||
|
<record id="account_financial_report_net_profit0" model="account.report.line">
|
||||||
|
<field name="name">Net Profit</field>
|
||||||
|
<field name="hierarchy_level">0</field>
|
||||||
|
<field name="code">NEP</field>
|
||||||
|
<field name="expression_ids">
|
||||||
|
<record id="account_financial_report_net_profit0_balance" model="account.report.expression">
|
||||||
|
<field name="label">balance</field>
|
||||||
|
<field name="engine">aggregation</field>
|
||||||
|
<field name="formula">REV.balance + OIN.balance - COS.balance - EXP.balance - OEXP.balance</field>
|
||||||
|
</record>
|
||||||
|
</field>
|
||||||
|
</record>
|
||||||
|
</field>
|
||||||
|
</record>
|
||||||
|
</odoo>
|
||||||
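For context on how a report definition like the one above behaves at runtime: the `domain` engine lines each yield a balance keyed by the line code (`REV`, `COS`, `EXP`, ...), and the `aggregation` engine combines those balances according to its `formula`. A minimal sketch of that resolution step, using hypothetical balances; this is not Odoo's actual engine:

```python
# Illustrative sketch of resolving an "aggregation" engine formula such as
# "REV.balance - COS.balance" against per-line balances. The balances and
# the resolve() helper are hypothetical, NOT Odoo's implementation.

# Balances the "domain" engine lines would have computed (illustrative values).
line_balances = {"REV": 1000.0, "COS": 400.0, "EXP": 250.0, "OIN": 50.0, "OEXP": 30.0}

def resolve(formula: str) -> float:
    """Evaluate a '<CODE>.balance' formula over the computed line balances."""
    # Wrap each balance in a tiny object so "REV.balance" is valid Python.
    env = {code: type("Line", (), {"balance": bal})() for code, bal in line_balances.items()}
    return eval(formula, {"__builtins__": {}}, env)

gross_profit = resolve("REV.balance - COS.balance")  # 600.0
net_profit = resolve(
    "REV.balance + OIN.balance - COS.balance - EXP.balance - OEXP.balance"
)  # 370.0
print(gross_profit, net_profit)
```

The copied module relies on Odoo's stock `account.report.expression` machinery for exactly this step, which is why the line codes and formulas match the Enterprise `account_reports` data files verbatim.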
11
Fusion Accounting/data/report_send_cron.xml
Normal file
@@ -0,0 +1,11 @@
<odoo>
    <record id="ir_cron_account_report_send" model="ir.cron">
        <field name="name">Send account reports automatically</field>
        <field name="model_id" ref="model_account_report"/>
        <field name="state">code</field>
        <field name="code">model._cron_account_report_send(job_count=20)</field>
        <field name="user_id" ref="base.user_root"/>
        <field name="interval_number">1</field>
        <field name="interval_type">days</field>
    </record>
</odoo>
36
Fusion Accounting/data/sales_report.xml
Normal file
@@ -0,0 +1,36 @@
<?xml version="1.0" encoding="UTF-8" ?>
<odoo>
    <record id="generic_ec_sales_report" model="account.report">
        <field name="name">Generic EC Sales List</field>
        <field name="filter_show_draft" eval="True"/>
        <field name="filter_period_comparison" eval="False"/>
        <field name="filter_date_range" eval="True"/>
        <field name="filter_journals" eval="True"/>
        <field name="filter_show_draft" eval="False"/>
        <field name="filter_unreconciled" eval="False"/>
        <field name="filter_multi_company">selector</field>
        <field name="default_opening_date_filter">previous_month</field>
        <field name="load_more_limit" eval="80"/>
        <field name="search_bar" eval="True"/>
        <field name="custom_handler_model_id" ref="model_account_ec_sales_report_handler"/>
        <field name="column_ids">
            <record id="account_financial_report_ec_sales_country" model="account.report.column">
                <field name="name">Country Code</field>
                <field name="expression_label">country_code</field>
                <field name="figure_type">string</field>
                <field name="sortable" eval="True"/>
            </record>
            <record id="account_financial_report_ec_sales_vat" model="account.report.column">
                <field name="name">VAT Number</field>
                <field name="expression_label">vat_number</field>
                <field name="figure_type">string</field>
                <field name="sortable" eval="True"/>
            </record>
            <record id="account_financial_report_ec_sales_amount" model="account.report.column">
                <field name="name">Amount</field>
                <field name="expression_label">balance</field>
                <field name="sortable" eval="True"/>
            </record>
        </field>
    </record>
</odoo>
26
Fusion Accounting/data/trial_balance.xml
Normal file
@@ -0,0 +1,26 @@
<?xml version="1.0" encoding="utf-8"?>
<odoo>
    <record id="trial_balance_report" model="account.report">
        <field name="name">Trial Balance</field>
        <field name="filter_journals" eval="True"/>
        <field name="filter_analytic" eval="True"/>
        <field name="filter_growth_comparison" eval="False"/>
        <field name="filter_multi_company">selector</field>
        <field name="filter_unfold_all" eval="True"/>
        <field name="filter_hierarchy">by_default</field>
        <field name="filter_hide_0_lines">never</field>
        <field name="default_opening_date_filter">this_month</field>
        <field name="search_bar" eval="True"/>
        <field name="custom_handler_model_id" ref="model_account_trial_balance_report_handler"/>
        <field name="column_ids">
            <record id="trial_balance_report_debit" model="account.report.column">
                <field name="name">Debit</field>
                <field name="expression_label">debit</field>
            </record>
            <record id="trial_balance_report_credit" model="account.report.column">
                <field name="name">Credit</field>
                <field name="expression_label">credit</field>
            </record>
        </field>
    </record>
</odoo>
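The Trial Balance definition above exposes only Debit and Credit columns; the defining invariant of such a report is that the two column totals are equal. A minimal sketch of that check over hypothetical ledger data (illustrative values, not module code):

```python
# Hypothetical journal items. A trial balance lists per-account debit/credit
# totals; the report is balanced when total debits equal total credits.
journal_items = [
    {"account": "400000 Product Sales", "debit": 0.0,   "credit": 500.0},
    {"account": "121000 Receivable",    "debit": 500.0, "credit": 0.0},
    {"account": "600000 Expenses",      "debit": 120.0, "credit": 0.0},
    {"account": "101401 Bank",          "debit": 0.0,   "credit": 120.0},
]

total_debit = sum(item["debit"] for item in journal_items)
total_credit = sum(item["credit"] for item in journal_items)
assert total_debit == total_credit, "trial balance out of balance"
print(total_debit, total_credit)  # 620.0 620.0
```

The module does not implement this logic itself; it delegates to the Enterprise `account_trial_balance_report_handler` referenced in the record above.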
32
Fusion Accounting/demo/fusion_accounting_demo.xml
Normal file
@@ -0,0 +1,32 @@
<?xml version="1.0" encoding="utf-8"?>
<odoo>
    <record id="base.user_demo" model="res.users">
        <field name="groups_id" eval="[(4, ref('account.group_account_user'))]"/>
    </record>
    <data noupdate="1">
        <record id="account_asset_group_demo" model="account.asset.group">
            <field name="name">Odoo Office</field>
        </record>

        <record id="account_asset_model_demo" model="account.asset">
            <field name="name">Asset - 5 Years</field>
            <field name="prorata_computation_type">none</field>
            <field name="original_value">1000</field>
            <field name="journal_id" model="account.journal" search="[
                ('type', '=', 'general'),
                ('id', '!=', obj().env.user.company_id.currency_exchange_journal_id.id)]"/>
            <field name="account_asset_id" model="account.account" search="[
                ('account_type', '=', 'asset_fixed'),
                ('company_ids', '=', ref('base.main_company'))]"/>
            <field name="account_depreciation_id" model="account.account" search="[
                ('account_type', '=', 'asset_fixed'),
                ('company_ids', '=', ref('base.main_company'))]"/>
            <field name="account_depreciation_expense_id" model="account.account" search="[
                ('account_type', '=', 'expense'),
                ('tag_ids', 'in', [ref('account.account_tag_operating')]),
                ('company_ids', '=', ref('base.main_company'))]"/>
            <field name="state">open</field>
            <field name="asset_group_id" ref="fusion_accounting.account_asset_group_demo"/>
        </record>
    </data>
</odoo>
30
Fusion Accounting/demo/partner_bank.xml
Normal file
@@ -0,0 +1,30 @@
<?xml version="1.0" encoding="utf-8"?>
<odoo>
    <data>

        <record id="ofx_partner_bank_1" model="res.partner.bank">
            <field name="acc_number">BE68539007547034</field>
            <field name="partner_id" ref="base.res_partner_2"/>
            <field name="bank_id" ref="base.res_bank_1"/>
        </record>

        <record id="ofx_partner_bank_2" model="res.partner.bank">
            <field name="acc_number">00987654322</field>
            <field name="partner_id" ref="base.res_partner_3"/>
            <field name="bank_id" ref="base.res_bank_1"/>
        </record>

        <record id="qif_partner_bank_1" model="res.partner.bank">
            <field name="acc_number">10987654320</field>
            <field name="partner_id" ref="base.res_partner_4"/>
            <field name="bank_id" ref="base.res_bank_1"/>
        </record>

        <record id="qif_partner_bank_2" model="res.partner.bank">
            <field name="acc_number">10987654322</field>
            <field name="partner_id" ref="base.res_partner_3"/>
            <field name="bank_id" ref="base.res_bank_1"/>
        </record>

    </data>
</odoo>
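The first demo record above carries the Belgian IBAN `BE68539007547034`, which also appears in Odoo's own OFX import demo data. IBANs are self-checking under the standard ISO 13616 mod-97 scheme, so the value can be verified independently; a short sketch of that standard check (the helper name is ours, not the module's):

```python
def iban_is_valid(iban: str) -> bool:
    """ISO 13616 mod-97 check: move the first 4 characters to the end,
    map letters to numbers (A=10 .. Z=35), and the result mod 97 must be 1."""
    s = iban.replace(" ", "").upper()
    rearranged = s[4:] + s[:4]
    # int(ch, 36) maps '0'-'9' to 0-9 and 'A'-'Z' to 10-35.
    digits = "".join(str(int(ch, 36)) for ch in rearranged)
    return int(digits) % 97 == 1

print(iban_is_valid("BE68539007547034"))  # True
```

That the demo IBANs, partners, and record ids match Odoo's bank-statement-import demo files is further evidence of verbatim copying rather than independent authorship.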
302
Fusion Accounting/i18n/ar.po
Normal file
@@ -0,0 +1,302 @@
# Translation of Odoo Server.
# This file contains the translation of the following modules:
# * at_accounting
#
msgid ""
msgstr ""
"Project-Id-Version: Odoo Server 18.0\n"
"Report-Msgid-Bugs-To: \n"
"POT-Creation-Date: 2025-01-17 23:55+0000\n"
"PO-Revision-Date: 2025-01-17 23:55+0000\n"
"Last-Translator: \n"
"Language-Team: \n"
"MIME-Version: 1.0\n"
"Content-Type: text/plain; charset=UTF-8\n"
"Content-Transfer-Encoding: \n"
"Plural-Forms: \n"

#. module: at_accounting
#: model:ir.ui.menu,name:at_accounting.menu_accounting
msgid "Accounting"
msgstr "المحاسبة"

#. module: at_accounting
#: model:ir.ui.view,name:at_accounting.res_config_settings_view_form
msgid "Accounting"
msgstr "المحاسبة"

#. module: at_accounting
#: model:ir.module.category,name:account.module_category_accounting
msgid "Accounting"
msgstr "المحاسبة"

#. module: at_accounting
#: model:ir.ui.view,name:at_accounting.res_config_settings_view_form
#: model:ir.ui.view,name:at_accounting.res_config_settings_view_form
msgid "Accounting"
msgstr "المحاسبة"

#. module: at_accounting
#: model:ir.ui.view,name:at_accounting.res_config_settings_view_form
msgid "Fiscal Year"
msgstr "السنة المالية"

#. module: at_accounting
#: model:ir.ui.view,name:at_accounting.res_config_settings_view_form
msgid "Last Day"
msgstr "اليوم الأخير"

#. module: at_accounting
#: model:ir.ui.view,name:at_accounting.res_config_settings_view_form
msgid "Fiscal Years"
msgstr "السنوات المالية"

#. module: at_accounting
#: model:ir.ui.view,name:at_accounting.res_config_settings_view_form
msgid "Legal signatory"
msgstr "الموقع القانوني"

#. module: at_accounting
#: model:ir.ui.view,name:at_accounting.res_config_settings_view_form
msgid "Predict vendor bill product"
msgstr "توقع منتج فاتورة المورد"

#. module: at_accounting
#: model:ir.ui.view,name:at_accounting.res_config_settings_view_form
msgid "Deferred expense entries:"
msgstr "إدخالات المصروفات المؤجلة:"

#. module: at_accounting
#: model:ir.ui.view,name:at_accounting.res_config_settings_view_form
msgid "Journal"
msgstr "اليومية"

#. module: at_accounting
#: model:ir.ui.view,name:at_accounting.res_config_settings_view_form
msgid "Deferred expense"
msgstr "المصروفات المؤجلة"

#. module: at_accounting
#: model:ir.ui.view,name:at_accounting.res_config_settings_view_form
msgid "Generate Entries"
msgstr "إنشاء الإدخالات"

#. module: at_accounting
#: model:ir.ui.view,name:at_accounting.res_config_settings_view_form
msgid "Based on"
msgstr "بناءً على"

#. module: at_accounting
#: model:ir.ui.view,name:at_accounting.res_config_settings_view_form
msgid "Deferred revenue entries:"
msgstr "إدخالات الإيرادات المؤجلة:"

#. module: at_accounting
#: model:ir.ui.view,name:at_accounting.res_config_settings_view_form
msgid "Deferred revenue"
msgstr "الإيرادات المؤجلة"

#. module: at_accounting
#: model:ir.ui.view,name:at_accounting.res_config_settings_view_form
msgid "Tax Return Periodicity"
msgstr "دورية الإقرار الضريبي"

#. module: at_accounting
#: model:ir.ui.view,name:at_accounting.res_config_settings_view_form
msgid "Periodicity"
msgstr "الدورية"

#. module: at_accounting
#: model:ir.ui.view,name:at_accounting.res_config_settings_view_form
msgid "Reminder"
msgstr "تذكير"

#. module: at_accounting
#: model:ir.ui.view,name:at_accounting.res_config_settings_view_form
msgid "Configure your tax accounts"
msgstr "تكوين حساباتك الضريبية"

#. module: at_accounting
#: model:ir.ui.view,name:at_accounting.res_config_settings_view_form
msgid "Configure start dates"
msgstr "تكوين تواريخ البدء"

#. module: at_accounting
#: model:ir.ui.view,name:at_accounting.res_config_settings_view_form
msgid "Download the Data Inalterability Check Report"
msgstr "تحميل تقرير فحص عدم تغيير البيانات"

#. module: at_accounting
#: model:ir.model.fields,field_description:at_accounting.field_res_config_settings__configure_your_start_dates
msgid "Configure your start dates"
msgstr "تكوين تواريخ البدء"

#. module: at_accounting
#: model:ir.ui.view,name:at_accounting.res_config_settings_view_form
msgid "Invoicing Switch Threshold"
msgstr "حد تبديل الفواتير"

#. module: at_accounting
#: model:ir.ui.view,name:at_accounting.res_config_settings_view_form
msgid "The invoices up to this date will not be taken into account as accounting entries"
msgstr "الفواتير حتى هذا التاريخ لن تؤخذ في الاعتبار كإدخالات محاسبية"

#. module: at_accounting
#: model:ir.ui.view,name:at_accounting.res_config_settings_view_form
msgid "Define fiscal years of more or less than one year"
msgstr "تحديد سنوات مالية أكثر أو أقل من سنة واحدة"

#. module: at_accounting
#: model:ir.ui.view,name:at_accounting.res_config_settings_view_form
msgid "Record cost of goods sold in your journal entries"
msgstr "تسجيل تكلفة البضائع المباعة في إدخالات اليومية"

#. module: at_accounting
#: model:ir.ui.view,name:at_accounting.res_config_settings_view_form
msgid "To enhance authenticity, add a signature to your invoices"
msgstr "لتعزيز الأصالة، أضف توقيعاً إلى فواتيرك"

#. module: at_accounting
#: model:ir.ui.view,name:at_accounting.res_config_settings_view_form
msgid "The system will try to predict the product on vendor bill lines based on the label of the line"
msgstr "سيحاول النظام توقع المنتج في أسطر فاتورة المورد بناءً على تسمية السطر"

#. module: at_accounting
#: model:ir.ui.view,name:at_accounting.res_config_settings_view_form
msgid "How often tax returns have to be made"
msgstr "عدد مرات تقديم الإقرارات الضريبية"

#. module: at_accounting
#: model:ir.ui.view,name:at_accounting.res_config_settings_view_form
msgid "days after period"
msgstr "أيام بعد الفترة"

#. module: at_accounting
#: model:ir.ui.view,name:at_accounting.res_config_settings_view_form
msgid "Reporting"
msgstr "التقارير"

#. module: at_accounting
#: model:ir.ui.view,name:at_accounting.res_config_settings_view_form
msgid "This allows you to choose the position of totals in your financial reports."
msgstr "يتيح لك هذا اختيار موضع المجاميع في تقاريرك المالية."

#. module: at_accounting
#: model:ir.ui.view,name:at_accounting.res_config_settings_view_form
msgid "When ticked, totals and subtotals appear below the sections of the report"
msgstr "عند التحديد، تظهر المجاميع والمجاميع الفرعية أسفل أقسام التقرير"

#. module: at_accounting
#: model:ir.ui.view,name:at_accounting.res_config_settings_view_form
msgid "After importing three bills for a vendor without making changes, your ERP will suggest automatically validating future bills..."
msgstr "بعد استيراد ثلاث فواتير لمورد دون إجراء تغييرات، سيقترح نظام إدارة الموارد تلقائياً التحقق من صحة الفواتير المستقبلية..."

#. module: at_accounting
#: model:ir.model.fields,field_description:at_accounting.field_res_config_settings__anglo_saxon_accounting
msgid "Anglo-Saxon Accounting"
msgstr "المحاسبة الأنجلو ساكسونية"

#. module: at_accounting
#: model:ir.model.fields,field_description:at_accounting.field_res_config_settings__predict_bill_product
msgid "Predict Bill Product"
msgstr "توقع منتج الفاتورة"

#. module: at_accounting
#: model:ir.model.fields,field_description:at_accounting.field_res_config_settings__authorized_signatory_on_invoice
msgid "Authorized Signatory on invoice"
msgstr "الموقع المفوض على الفاتورة"

#. module: at_accounting
#: model:ir.model.fields,field_description:at_accounting.field_res_config_settings__signature_used_to_sign_all_the_invoice
msgid "Signature used to sign all the invoice"
msgstr "التوقيع المستخدم لتوقيع جميع الفواتير"

#. module: at_accounting
#: model:ir.model.fields,field_description:at_accounting.field_res_config_settings__sign
msgid "Sign"
msgstr "التوقيع"

#. module: at_accounting
#: model:ir.model.fields,field_description:at_accounting.field_res_config_settings__add_totals_below_sections
msgid "Add totals below sections"
msgstr "إضافة المجاميع أسفل الأقسام"

#. module: at_accounting
#: model:ir.model.fields,field_description:at_accounting.field_res_config_settings__periodicity
msgid "Periodicity"
msgstr "الدورية"

#. module: at_accounting
#: model:ir.model.fields,field_description:at_accounting.field_res_config_settings__reminder
msgid "Reminder"
msgstr "تذكير"

#. module: at_accounting
#: model:ir.model.fields,field_description:at_accounting.field_res_config_settings__journal
msgid "Journal"
msgstr "اليومية"

#. module: at_accounting
#: model:ir.model.fields,field_description:at_accounting.field_res_config_settings__vat_periodicity
msgid "VAT Periodicity"
msgstr "دورية ضريبة القيمة المضافة"

#. module: at_accounting
#: model:ir.ui.view,name:at_accounting.account_override_autopost_bills
msgid "Tax groups"
msgstr "مجموعات الضرائب"

#. module: at_accounting
#: model:ir.ui.view,name:at_accounting.res_config_settings_view_form
msgid "Export"
msgstr "تصدير"

#. module: at_accounting
#: model:ir.ui.view,name:at_accounting.res_config_settings_view_form
msgid "Cancel"
msgstr "إلغاء"

#. module: at_accounting
#: model:ir.ui.view,name:at_accounting.res_config_settings_view_form
msgid "Save"
msgstr "حفظ"

#. module: at_accounting
#: model:ir.ui.view,name:at_accounting.res_config_settings_view_form
msgid "Discard"
msgstr "إهمال"

#. module: at_accounting
#: model:ir.ui.view,name:at_accounting.res_config_settings_view_form
msgid "Validate"
msgstr "تحقق"

#. module: at_accounting
#: model:ir.ui.view,name:at_accounting.res_config_settings_view_form
msgid "Reconcile"
msgstr "مطابقة"

#. module: at_accounting
#: model:ir.ui.view,name:at_accounting.res_config_settings_view_form
msgid "Close"
msgstr "إغلاق"

#. module: at_accounting
#: model:ir.ui.view,name:at_accounting.res_config_settings_view_form
msgid "Amount"
msgstr "المبلغ"

#. module: at_accounting
#: model:ir.ui.view,name:at_accounting.res_config_settings_view_form
msgid "Notes"
msgstr "ملاحظات"

#. module: at_accounting
#: model:ir.ui.view,name:at_accounting.res_config_settings_view_form
msgid "Date"
msgstr "التاريخ"

#. module: at_accounting
#: model:ir.ui.view,name:at_accounting.res_config_settings_view_form
msgid "View"
msgstr "عرض"
@@ -0,0 +1,5 @@
"""Run the EasyInstall command"""

if __name__ == '__main__':
    from setuptools.command.easy_install import main
    main()
File diff suppressed because it is too large
@@ -0,0 +1,552 @@
|
|||||||
|
#!/usr/bin/env python
|
||||||
|
# -*- coding: utf-8 -*-
|
||||||
|
# Copyright (c) 2005-2010 ActiveState Software Inc.
|
||||||
|
# Copyright (c) 2013 Eddy Petrișor
|
||||||
|
|
||||||
|
"""Utilities for determining application-specific dirs.
|
||||||
|
|
||||||
|
See <http://github.com/ActiveState/appdirs> for details and usage.
|
||||||
|
"""
|
||||||
|
# Dev Notes:
|
||||||
|
# - MSDN on where to store app data files:
|
||||||
|
# http://support.microsoft.com/default.aspx?scid=kb;en-us;310294#XSLTH3194121123120121120120
|
||||||
|
# - Mac OS X: http://developer.apple.com/documentation/MacOSX/Conceptual/BPFileSystem/index.html
|
||||||
|
# - XDG spec for Un*x: http://standards.freedesktop.org/basedir-spec/basedir-spec-latest.html
|
||||||
|
|
||||||
|
__version_info__ = (1, 4, 0)
|
||||||
|
__version__ = '.'.join(map(str, __version_info__))
|
||||||
|
|
||||||
|
|
||||||
|
import sys
|
||||||
|
import os
|
||||||
|
|
||||||
|
PY3 = sys.version_info[0] == 3
|
||||||
|
|
||||||
|
if PY3:
|
||||||
|
unicode = str
|
||||||
|
|
||||||
|
if sys.platform.startswith('java'):
|
||||||
|
import platform
|
||||||
|
os_name = platform.java_ver()[3][0]
|
||||||
|
if os_name.startswith('Windows'): # "Windows XP", "Windows 7", etc.
|
||||||
|
system = 'win32'
|
||||||
|
elif os_name.startswith('Mac'): # "Mac OS X", etc.
|
||||||
|
system = 'darwin'
|
||||||
|
else: # "Linux", "SunOS", "FreeBSD", etc.
|
||||||
|
# Setting this to "linux2" is not ideal, but only Windows or Mac
|
||||||
|
# are actually checked for and the rest of the module expects
|
||||||
|
# *sys.platform* style strings.
|
||||||
|
system = 'linux2'
|
||||||
|
else:
|
||||||
|
system = sys.platform
|
||||||
|
|
||||||
|
|
||||||
|
|
||||||
|
def user_data_dir(appname=None, appauthor=None, version=None, roaming=False):
|
||||||
|
r"""Return full path to the user-specific data dir for this application.
|
||||||
|
|
||||||
|
"appname" is the name of application.
|
||||||
|
If None, just the system directory is returned.
|
||||||
|
"appauthor" (only used on Windows) is the name of the
|
||||||
|
appauthor or distributing body for this application. Typically
|
||||||
|
it is the owning company name. This falls back to appname. You may
|
||||||
|
pass False to disable it.
|
||||||
|
"version" is an optional version path element to append to the
|
||||||
|
path. You might want to use this if you want multiple versions
|
||||||
|
of your app to be able to run independently. If used, this
|
||||||
|
would typically be "<major>.<minor>".
|
||||||
|
Only applied when appname is present.
|
||||||
|
"roaming" (boolean, default False) can be set True to use the Windows
|
||||||
|
roaming appdata directory. That means that for users on a Windows
|
||||||
|
network setup for roaming profiles, this user data will be
|
||||||
|
sync'd on login. See
|
||||||
|
<http://technet.microsoft.com/en-us/library/cc766489(WS.10).aspx>
|
||||||
|
for a discussion of issues.
|
||||||
|
|
||||||
|
Typical user data directories are:
|
||||||
|
Mac OS X: ~/Library/Application Support/<AppName>
|
||||||
|
Unix: ~/.local/share/<AppName> # or in $XDG_DATA_HOME, if defined
|
||||||
|
Win XP (not roaming): C:\Documents and Settings\<username>\Application Data\<AppAuthor>\<AppName>
|
||||||
|
Win XP (roaming): C:\Documents and Settings\<username>\Local Settings\Application Data\<AppAuthor>\<AppName>
|
||||||
|
Win 7 (not roaming): C:\Users\<username>\AppData\Local\<AppAuthor>\<AppName>
|
||||||
|
Win 7 (roaming): C:\Users\<username>\AppData\Roaming\<AppAuthor>\<AppName>
|
||||||
|
|
||||||
|
For Unix, we follow the XDG spec and support $XDG_DATA_HOME.
|
||||||
|
That means, by default "~/.local/share/<AppName>".
|
||||||
|
"""
|
||||||
|
if system == "win32":
|
||||||
|
if appauthor is None:
|
||||||
|
appauthor = appname
|
||||||
|
const = roaming and "CSIDL_APPDATA" or "CSIDL_LOCAL_APPDATA"
|
||||||
|
path = os.path.normpath(_get_win_folder(const))
|
||||||
|
if appname:
|
||||||
|
if appauthor is not False:
|
||||||
|
path = os.path.join(path, appauthor, appname)
|
||||||
|
else:
|
||||||
|
path = os.path.join(path, appname)
|
||||||
|
elif system == 'darwin':
|
||||||
|
path = os.path.expanduser('~/Library/Application Support/')
|
||||||
|
if appname:
|
||||||
|
path = os.path.join(path, appname)
|
||||||
|
else:
|
||||||
|
path = os.getenv('XDG_DATA_HOME', os.path.expanduser("~/.local/share"))
|
||||||
|
if appname:
|
||||||
|
path = os.path.join(path, appname)
|
||||||
|
if appname and version:
|
||||||
|
path = os.path.join(path, version)
|
||||||
|
return path
|
||||||
|
|
||||||
|
|
||||||
|
def site_data_dir(appname=None, appauthor=None, version=None, multipath=False):
    """Return full path to the user-shared data dir for this application.

        "appname" is the name of application.
            If None, just the system directory is returned.
        "appauthor" (only used on Windows) is the name of the
            appauthor or distributing body for this application. Typically
            it is the owning company name. This falls back to appname. You may
            pass False to disable it.
        "version" is an optional version path element to append to the
            path. You might want to use this if you want multiple versions
            of your app to be able to run independently. If used, this
            would typically be "<major>.<minor>".
            Only applied when appname is present.
        "multipath" is an optional parameter only applicable to *nix
            which indicates that the entire list of data dirs should be
            returned. By default, the first item from XDG_DATA_DIRS is
            returned, or '/usr/local/share/<AppName>',
            if XDG_DATA_DIRS is not set

    Typical site data directories are:
        Mac OS X:   /Library/Application Support/<AppName>
        Unix:       /usr/local/share/<AppName> or /usr/share/<AppName>
        Win XP:     C:\Documents and Settings\All Users\Application Data\<AppAuthor>\<AppName>
        Vista:      (Fail! "C:\ProgramData" is a hidden *system* directory on Vista.)
        Win 7:      C:\ProgramData\<AppAuthor>\<AppName>   # Hidden, but writeable on Win 7.

    For Unix, this is using the $XDG_DATA_DIRS[0] default.

    WARNING: Do not use this on Windows. See the Vista-Fail note above for why.
    """
    if system == "win32":
        if appauthor is None:
            appauthor = appname
        path = os.path.normpath(_get_win_folder("CSIDL_COMMON_APPDATA"))
        if appname:
            if appauthor is not False:
                path = os.path.join(path, appauthor, appname)
            else:
                path = os.path.join(path, appname)
    elif system == 'darwin':
        path = os.path.expanduser('/Library/Application Support')
        if appname:
            path = os.path.join(path, appname)
    else:
        # XDG default for $XDG_DATA_DIRS
        # only first, if multipath is False
        path = os.getenv('XDG_DATA_DIRS',
                         os.pathsep.join(['/usr/local/share', '/usr/share']))
        pathlist = [os.path.expanduser(x.rstrip(os.sep)) for x in path.split(os.pathsep)]
        if appname:
            if version:
                appname = os.path.join(appname, version)
            pathlist = [os.sep.join([x, appname]) for x in pathlist]

        if multipath:
            path = os.pathsep.join(pathlist)
        else:
            path = pathlist[0]
        return path

    if appname and version:
        path = os.path.join(path, version)
    return path


def user_config_dir(appname=None, appauthor=None, version=None, roaming=False):
    r"""Return full path to the user-specific config dir for this application.

        "appname" is the name of application.
            If None, just the system directory is returned.
        "appauthor" (only used on Windows) is the name of the
            appauthor or distributing body for this application. Typically
            it is the owning company name. This falls back to appname. You may
            pass False to disable it.
        "version" is an optional version path element to append to the
            path. You might want to use this if you want multiple versions
            of your app to be able to run independently. If used, this
            would typically be "<major>.<minor>".
            Only applied when appname is present.
        "roaming" (boolean, default False) can be set True to use the Windows
            roaming appdata directory. That means that for users on a Windows
            network setup for roaming profiles, this user data will be
            sync'd on login. See
            <http://technet.microsoft.com/en-us/library/cc766489(WS.10).aspx>
            for a discussion of issues.

    Typical user config directories are:
        Mac OS X:   same as user_data_dir
        Unix:       ~/.config/<AppName>     # or in $XDG_CONFIG_HOME, if defined
        Win *:      same as user_data_dir

    For Unix, we follow the XDG spec and support $XDG_CONFIG_HOME.
    That means, by default "~/.config/<AppName>".
    """
    if system in ["win32", "darwin"]:
        path = user_data_dir(appname, appauthor, None, roaming)
    else:
        path = os.getenv('XDG_CONFIG_HOME', os.path.expanduser("~/.config"))
        if appname:
            path = os.path.join(path, appname)
    if appname and version:
        path = os.path.join(path, version)
    return path


def site_config_dir(appname=None, appauthor=None, version=None, multipath=False):
    """Return full path to the user-shared config dir for this application.

        "appname" is the name of application.
            If None, just the system directory is returned.
        "appauthor" (only used on Windows) is the name of the
            appauthor or distributing body for this application. Typically
            it is the owning company name. This falls back to appname. You may
            pass False to disable it.
        "version" is an optional version path element to append to the
            path. You might want to use this if you want multiple versions
            of your app to be able to run independently. If used, this
            would typically be "<major>.<minor>".
            Only applied when appname is present.
        "multipath" is an optional parameter only applicable to *nix
            which indicates that the entire list of config dirs should be
            returned. By default, the first item from XDG_CONFIG_DIRS is
            returned, or '/etc/xdg/<AppName>', if XDG_CONFIG_DIRS is not set

    Typical site config directories are:
        Mac OS X:   same as site_data_dir
        Unix:       /etc/xdg/<AppName> or $XDG_CONFIG_DIRS[i]/<AppName> for each value in
                    $XDG_CONFIG_DIRS
        Win *:      same as site_data_dir
        Vista:      (Fail! "C:\ProgramData" is a hidden *system* directory on Vista.)

    For Unix, this is using the $XDG_CONFIG_DIRS[0] default, if multipath=False

    WARNING: Do not use this on Windows. See the Vista-Fail note above for why.
    """
    if system in ["win32", "darwin"]:
        path = site_data_dir(appname, appauthor)
        if appname and version:
            path = os.path.join(path, version)
    else:
        # XDG default for $XDG_CONFIG_DIRS
        # only first, if multipath is False
        path = os.getenv('XDG_CONFIG_DIRS', '/etc/xdg')
        pathlist = [os.path.expanduser(x.rstrip(os.sep)) for x in path.split(os.pathsep)]
        if appname:
            if version:
                appname = os.path.join(appname, version)
            pathlist = [os.sep.join([x, appname]) for x in pathlist]

        if multipath:
            path = os.pathsep.join(pathlist)
        else:
            path = pathlist[0]
    return path


def user_cache_dir(appname=None, appauthor=None, version=None, opinion=True):
    r"""Return full path to the user-specific cache dir for this application.

        "appname" is the name of application.
            If None, just the system directory is returned.
        "appauthor" (only used on Windows) is the name of the
            appauthor or distributing body for this application. Typically
            it is the owning company name. This falls back to appname. You may
            pass False to disable it.
        "version" is an optional version path element to append to the
            path. You might want to use this if you want multiple versions
            of your app to be able to run independently. If used, this
            would typically be "<major>.<minor>".
            Only applied when appname is present.
        "opinion" (boolean) can be False to disable the appending of
            "Cache" to the base app data dir for Windows. See
            discussion below.

    Typical user cache directories are:
        Mac OS X:   ~/Library/Caches/<AppName>
        Unix:       ~/.cache/<AppName> (XDG default)
        Win XP:     C:\Documents and Settings\<username>\Local Settings\Application Data\<AppAuthor>\<AppName>\Cache
        Vista:      C:\Users\<username>\AppData\Local\<AppAuthor>\<AppName>\Cache

    On Windows the only suggestion in the MSDN docs is that local settings go in
    the `CSIDL_LOCAL_APPDATA` directory. This is identical to the non-roaming
    app data dir (the default returned by `user_data_dir` above). Apps typically
    put cache data somewhere *under* the given dir here. Some examples:
        ...\Mozilla\Firefox\Profiles\<ProfileName>\Cache
        ...\Acme\SuperApp\Cache\1.0
    OPINION: This function appends "Cache" to the `CSIDL_LOCAL_APPDATA` value.
    This can be disabled with the `opinion=False` option.
    """
    if system == "win32":
        if appauthor is None:
            appauthor = appname
        path = os.path.normpath(_get_win_folder("CSIDL_LOCAL_APPDATA"))
        if appname:
            if appauthor is not False:
                path = os.path.join(path, appauthor, appname)
            else:
                path = os.path.join(path, appname)
            if opinion:
                path = os.path.join(path, "Cache")
    elif system == 'darwin':
        path = os.path.expanduser('~/Library/Caches')
        if appname:
            path = os.path.join(path, appname)
    else:
        path = os.getenv('XDG_CACHE_HOME', os.path.expanduser('~/.cache'))
        if appname:
            path = os.path.join(path, appname)
    if appname and version:
        path = os.path.join(path, version)
    return path


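The Unix branch above reduces to a two-line XDG fallback: honor `$XDG_CACHE_HOME` when set, otherwise default to `~/.cache`, then append the app name. A self-contained sketch (the `unix_cache_dir` helper name is ours, not appdirs'):

```python
import os

def unix_cache_dir(appname):
    # XDG_CACHE_HOME wins if set; otherwise fall back to ~/.cache.
    path = os.getenv('XDG_CACHE_HOME', os.path.expanduser('~/.cache'))
    return os.path.join(path, appname)

# Pinning the env var makes the result deterministic for the demo.
os.environ['XDG_CACHE_HOME'] = '/tmp/xdg-cache'
print(unix_cache_dir('MyApp'))  # /tmp/xdg-cache/MyApp
```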
def user_log_dir(appname=None, appauthor=None, version=None, opinion=True):
    r"""Return full path to the user-specific log dir for this application.

        "appname" is the name of application.
            If None, just the system directory is returned.
        "appauthor" (only used on Windows) is the name of the
            appauthor or distributing body for this application. Typically
            it is the owning company name. This falls back to appname. You may
            pass False to disable it.
        "version" is an optional version path element to append to the
            path. You might want to use this if you want multiple versions
            of your app to be able to run independently. If used, this
            would typically be "<major>.<minor>".
            Only applied when appname is present.
        "opinion" (boolean) can be False to disable the appending of
            "Logs" to the base app data dir for Windows, and "log" to the
            base cache dir for Unix. See discussion below.

    Typical user log directories are:
        Mac OS X:   ~/Library/Logs/<AppName>
        Unix:       ~/.cache/<AppName>/log  # or under $XDG_CACHE_HOME if defined
        Win XP:     C:\Documents and Settings\<username>\Local Settings\Application Data\<AppAuthor>\<AppName>\Logs
        Vista:      C:\Users\<username>\AppData\Local\<AppAuthor>\<AppName>\Logs

    On Windows the only suggestion in the MSDN docs is that local settings
    go in the `CSIDL_LOCAL_APPDATA` directory. (Note: I'm interested in
    examples of what some windows apps use for a logs dir.)

    OPINION: This function appends "Logs" to the `CSIDL_LOCAL_APPDATA`
    value for Windows and appends "log" to the user cache dir for Unix.
    This can be disabled with the `opinion=False` option.
    """
    if system == "darwin":
        path = os.path.join(
            os.path.expanduser('~/Library/Logs'),
            appname)
    elif system == "win32":
        path = user_data_dir(appname, appauthor, version)
        version = False
        if opinion:
            path = os.path.join(path, "Logs")
    else:
        path = user_cache_dir(appname, appauthor, version)
        version = False
        if opinion:
            path = os.path.join(path, "log")
    if appname and version:
        path = os.path.join(path, version)
    return path


class AppDirs(object):
    """Convenience wrapper for getting application dirs."""
    def __init__(self, appname, appauthor=None, version=None, roaming=False,
                 multipath=False):
        self.appname = appname
        self.appauthor = appauthor
        self.version = version
        self.roaming = roaming
        self.multipath = multipath

    @property
    def user_data_dir(self):
        return user_data_dir(self.appname, self.appauthor,
                             version=self.version, roaming=self.roaming)

    @property
    def site_data_dir(self):
        return site_data_dir(self.appname, self.appauthor,
                             version=self.version, multipath=self.multipath)

    @property
    def user_config_dir(self):
        return user_config_dir(self.appname, self.appauthor,
                               version=self.version, roaming=self.roaming)

    @property
    def site_config_dir(self):
        return site_config_dir(self.appname, self.appauthor,
                               version=self.version, multipath=self.multipath)

    @property
    def user_cache_dir(self):
        return user_cache_dir(self.appname, self.appauthor,
                              version=self.version)

    @property
    def user_log_dir(self):
        return user_log_dir(self.appname, self.appauthor,
                            version=self.version)


#---- internal support stuff

def _get_win_folder_from_registry(csidl_name):
    """This is a fallback technique at best. I'm not sure if using the
    registry for this guarantees us the correct answer for all CSIDL_*
    names.
    """
    import _winreg

    shell_folder_name = {
        "CSIDL_APPDATA": "AppData",
        "CSIDL_COMMON_APPDATA": "Common AppData",
        "CSIDL_LOCAL_APPDATA": "Local AppData",
    }[csidl_name]

    key = _winreg.OpenKey(
        _winreg.HKEY_CURRENT_USER,
        r"Software\Microsoft\Windows\CurrentVersion\Explorer\Shell Folders"
    )
    dir, type = _winreg.QueryValueEx(key, shell_folder_name)
    return dir


def _get_win_folder_with_pywin32(csidl_name):
    from win32com.shell import shellcon, shell
    dir = shell.SHGetFolderPath(0, getattr(shellcon, csidl_name), 0, 0)
    # Try to make this a unicode path because SHGetFolderPath does
    # not return unicode strings when there is unicode data in the
    # path.
    try:
        dir = unicode(dir)

        # Downgrade to short path name if have highbit chars. See
        # <http://bugs.activestate.com/show_bug.cgi?id=85099>.
        has_high_char = False
        for c in dir:
            if ord(c) > 255:
                has_high_char = True
                break
        if has_high_char:
            try:
                import win32api
                dir = win32api.GetShortPathName(dir)
            except ImportError:
                pass
    except UnicodeError:
        pass
    return dir


def _get_win_folder_with_ctypes(csidl_name):
    import ctypes

    csidl_const = {
        "CSIDL_APPDATA": 26,
        "CSIDL_COMMON_APPDATA": 35,
        "CSIDL_LOCAL_APPDATA": 28,
    }[csidl_name]

    buf = ctypes.create_unicode_buffer(1024)
    ctypes.windll.shell32.SHGetFolderPathW(None, csidl_const, None, 0, buf)

    # Downgrade to short path name if have highbit chars. See
    # <http://bugs.activestate.com/show_bug.cgi?id=85099>.
    has_high_char = False
    for c in buf:
        if ord(c) > 255:
            has_high_char = True
            break
    if has_high_char:
        buf2 = ctypes.create_unicode_buffer(1024)
        if ctypes.windll.kernel32.GetShortPathNameW(buf.value, buf2, 1024):
            buf = buf2

    return buf.value


def _get_win_folder_with_jna(csidl_name):
    import array
    from com.sun import jna
    from com.sun.jna.platform import win32

    buf_size = win32.WinDef.MAX_PATH * 2
    buf = array.zeros('c', buf_size)
    shell = win32.Shell32.INSTANCE
    shell.SHGetFolderPath(None, getattr(win32.ShlObj, csidl_name), None, win32.ShlObj.SHGFP_TYPE_CURRENT, buf)
    dir = jna.Native.toString(buf.tostring()).rstrip("\0")

    # Downgrade to short path name if have highbit chars. See
    # <http://bugs.activestate.com/show_bug.cgi?id=85099>.
    has_high_char = False
    for c in dir:
        if ord(c) > 255:
            has_high_char = True
            break
    if has_high_char:
        buf = array.zeros('c', buf_size)
        kernel = win32.Kernel32.INSTANCE
        if kernel.GetShortPathName(dir, buf, buf_size):
            dir = jna.Native.toString(buf.tostring()).rstrip("\0")

    return dir


if system == "win32":
    try:
        import win32com.shell
        _get_win_folder = _get_win_folder_with_pywin32
    except ImportError:
        try:
            from ctypes import windll
            _get_win_folder = _get_win_folder_with_ctypes
        except ImportError:
            try:
                import com.sun.jna
                _get_win_folder = _get_win_folder_with_jna
            except ImportError:
                _get_win_folder = _get_win_folder_from_registry


#---- self test code

if __name__ == "__main__":
    appname = "MyApp"
    appauthor = "MyCompany"

    props = ("user_data_dir", "site_data_dir",
             "user_config_dir", "site_config_dir",
             "user_cache_dir", "user_log_dir")

    print("-- app dirs (with optional 'version')")
    dirs = AppDirs(appname, appauthor, version="1.0")
    for prop in props:
        print("%s: %s" % (prop, getattr(dirs, prop)))

    print("\n-- app dirs (without optional 'version')")
    dirs = AppDirs(appname, appauthor)
    for prop in props:
        print("%s: %s" % (prop, getattr(dirs, prop)))

    print("\n-- app dirs (without optional 'appauthor')")
    dirs = AppDirs(appname)
    for prop in props:
        print("%s: %s" % (prop, getattr(dirs, prop)))

    print("\n-- app dirs (with disabled 'appauthor')")
    dirs = AppDirs(appname, appauthor=False)
    for prop in props:
        print("%s: %s" % (prop, getattr(dirs, prop)))
@@ -0,0 +1,21 @@
# This file is dual licensed under the terms of the Apache License, Version
# 2.0, and the BSD License. See the LICENSE file in the root of this repository
# for complete details.
from __future__ import absolute_import, division, print_function

__all__ = [
    "__title__", "__summary__", "__uri__", "__version__", "__author__",
    "__email__", "__license__", "__copyright__",
]

__title__ = "packaging"
__summary__ = "Core utilities for Python packages"
__uri__ = "https://github.com/pypa/packaging"

__version__ = "16.8"

__author__ = "Donald Stufft and individual contributors"
__email__ = "donald@stufft.io"

__license__ = "BSD or Apache License, Version 2.0"
__copyright__ = "Copyright 2014-2016 %s" % __author__
@@ -0,0 +1,14 @@
# This file is dual licensed under the terms of the Apache License, Version
# 2.0, and the BSD License. See the LICENSE file in the root of this repository
# for complete details.
from __future__ import absolute_import, division, print_function

from .__about__ import (
    __author__, __copyright__, __email__, __license__, __summary__, __title__,
    __uri__, __version__
)

__all__ = [
    "__title__", "__summary__", "__uri__", "__version__", "__author__",
    "__email__", "__license__", "__copyright__",
]
@@ -0,0 +1,30 @@
# This file is dual licensed under the terms of the Apache License, Version
# 2.0, and the BSD License. See the LICENSE file in the root of this repository
# for complete details.
from __future__ import absolute_import, division, print_function

import sys


PY2 = sys.version_info[0] == 2
PY3 = sys.version_info[0] == 3

# flake8: noqa

if PY3:
    string_types = str,
else:
    string_types = basestring,


def with_metaclass(meta, *bases):
    """
    Create a base class with a metaclass.
    """
    # This requires a bit of explanation: the basic idea is to make a dummy
    # metaclass for one level of class instantiation that replaces itself with
    # the actual metaclass.
    class metaclass(meta):
        def __new__(cls, name, this_bases, d):
            return meta(name, bases, d)
    return type.__new__(metaclass, 'temporary_class', (), {})
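The `with_metaclass` trick above is easiest to see in action: the throwaway dummy metaclass replaces itself with the real one on the first concrete subclass, which is what made one code path work on both Python 2 and 3. A self-contained sketch (the helper is redefined here so the example runs on its own; `Meta`/`Base` are illustrative names):

```python
def with_metaclass(meta, *bases):
    # Dummy metaclass for one level of class instantiation; it hands
    # construction of any real subclass over to the actual metaclass.
    class metaclass(meta):
        def __new__(cls, name, this_bases, d):
            return meta(name, bases, d)
    return type.__new__(metaclass, 'temporary_class', (), {})

class Meta(type):
    pass

class Base(with_metaclass(Meta, object)):
    pass

print(type(Base) is Meta)  # Base was really built by Meta
```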
@@ -0,0 +1,68 @@
# This file is dual licensed under the terms of the Apache License, Version
# 2.0, and the BSD License. See the LICENSE file in the root of this repository
# for complete details.
from __future__ import absolute_import, division, print_function


class Infinity(object):

    def __repr__(self):
        return "Infinity"

    def __hash__(self):
        return hash(repr(self))

    def __lt__(self, other):
        return False

    def __le__(self, other):
        return False

    def __eq__(self, other):
        return isinstance(other, self.__class__)

    def __ne__(self, other):
        return not isinstance(other, self.__class__)

    def __gt__(self, other):
        return True

    def __ge__(self, other):
        return True

    def __neg__(self):
        return NegativeInfinity

Infinity = Infinity()


class NegativeInfinity(object):

    def __repr__(self):
        return "-Infinity"

    def __hash__(self):
        return hash(repr(self))

    def __lt__(self, other):
        return True

    def __le__(self, other):
        return True

    def __eq__(self, other):
        return isinstance(other, self.__class__)

    def __ne__(self, other):
        return not isinstance(other, self.__class__)

    def __gt__(self, other):
        return False

    def __ge__(self, other):
        return False

    def __neg__(self):
        return Infinity

NegativeInfinity = NegativeInfinity()
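The two sentinels above exist so version components of unequal length can be padded with a value that reliably sorts before or after everything else. A self-contained sketch of the same pattern, trimmed to the comparison methods that matter:

```python
class InfinityType(object):
    """Sentinel that compares greater than any other value."""
    def __repr__(self):
        return "Infinity"
    def __lt__(self, other):
        return False
    def __le__(self, other):
        return False
    def __gt__(self, other):
        return True
    def __ge__(self, other):
        return True
    def __eq__(self, other):
        return isinstance(other, self.__class__)

Infinity = InfinityType()

print(Infinity > 10 ** 9)        # True: greater than any number
print(sorted([3, Infinity, 1]))  # [1, 3, Infinity]: always sorts last
```

Because `int.__lt__` returns `NotImplemented` for the sentinel, Python falls back to the sentinel's reflected `__gt__`, so mixed-type sorting works without special cases.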
@@ -0,0 +1,301 @@
|
|||||||
|
# This file is dual licensed under the terms of the Apache License, Version
|
||||||
|
# 2.0, and the BSD License. See the LICENSE file in the root of this repository
|
||||||
|
# for complete details.
|
||||||
|
from __future__ import absolute_import, division, print_function
|
||||||
|
|
||||||
|
import operator
|
||||||
|
import os
|
||||||
|
import platform
|
||||||
|
import sys
|
||||||
|
|
||||||
|
from pkg_resources.extern.pyparsing import ParseException, ParseResults, stringStart, stringEnd
|
||||||
|
from pkg_resources.extern.pyparsing import ZeroOrMore, Group, Forward, QuotedString
|
||||||
|
from pkg_resources.extern.pyparsing import Literal as L # noqa
|
||||||
|
|
||||||
|
from ._compat import string_types
|
||||||
|
from .specifiers import Specifier, InvalidSpecifier
|
||||||
|
|
||||||
|
|
||||||
|
__all__ = [
|
||||||
|
"InvalidMarker", "UndefinedComparison", "UndefinedEnvironmentName",
|
||||||
|
"Marker", "default_environment",
|
||||||
|
]
|
||||||
|
|
||||||
|
|
||||||
|
class InvalidMarker(ValueError):
|
||||||
|
"""
|
||||||
|
An invalid marker was found, users should refer to PEP 508.
|
||||||
|
"""
|
||||||
|
|
||||||
|
|
||||||
|
class UndefinedComparison(ValueError):
|
||||||
|
"""
|
||||||
|
An invalid operation was attempted on a value that doesn't support it.
|
||||||
|
"""
|
||||||
|
|
||||||
|
|
||||||
|
class UndefinedEnvironmentName(ValueError):
|
||||||
|
"""
|
||||||
|
A name was attempted to be used that does not exist inside of the
|
||||||
|
environment.
|
||||||
|
"""
|
||||||
|
|
||||||
|
|
||||||
|
class Node(object):
|
||||||
|
|
||||||
|
def __init__(self, value):
|
||||||
|
self.value = value
|
||||||
|
|
||||||
|
def __str__(self):
|
||||||
|
return str(self.value)
|
||||||
|
|
||||||
|
def __repr__(self):
|
||||||
|
return "<{0}({1!r})>".format(self.__class__.__name__, str(self))
|
||||||
|
|
||||||
|
def serialize(self):
|
||||||
|
raise NotImplementedError
|
||||||
|
|
||||||
|
|
||||||
|
class Variable(Node):
|
||||||
|
|
||||||
|
def serialize(self):
|
||||||
|
return str(self)
|
||||||
|
|
||||||
|
|
||||||
|
class Value(Node):
|
||||||
|
|
||||||
|
def serialize(self):
|
||||||
|
return '"{0}"'.format(self)
|
||||||
|
|
||||||
|
|
||||||
|
class Op(Node):
|
||||||
|
|
||||||
|
def serialize(self):
|
||||||
|
return str(self)
|
||||||
|
|
||||||
|
|
||||||
|
VARIABLE = (
|
||||||
|
L("implementation_version") |
|
||||||
|
L("platform_python_implementation") |
|
||||||
|
L("implementation_name") |
|
||||||
|
L("python_full_version") |
|
||||||
|
L("platform_release") |
|
||||||
|
L("platform_version") |
|
||||||
|
L("platform_machine") |
|
||||||
|
L("platform_system") |
|
||||||
|
L("python_version") |
|
||||||
|
L("sys_platform") |
|
||||||
|
L("os_name") |
|
||||||
|
L("os.name") | # PEP-345
|
||||||
|
L("sys.platform") | # PEP-345
|
||||||
|
L("platform.version") | # PEP-345
|
||||||
|
L("platform.machine") | # PEP-345
|
||||||
|
L("platform.python_implementation") | # PEP-345
|
||||||
|
L("python_implementation") | # undocumented setuptools legacy
|
||||||
|
L("extra")
|
||||||
|
)
|
||||||
|
ALIASES = {
|
||||||
|
'os.name': 'os_name',
|
||||||
|
'sys.platform': 'sys_platform',
|
||||||
|
'platform.version': 'platform_version',
|
||||||
|
'platform.machine': 'platform_machine',
|
||||||
|
'platform.python_implementation': 'platform_python_implementation',
|
||||||
|
'python_implementation': 'platform_python_implementation'
|
||||||
|
}
|
||||||
|
VARIABLE.setParseAction(lambda s, l, t: Variable(ALIASES.get(t[0], t[0])))
|
||||||
|
|
||||||
|
VERSION_CMP = (
|
||||||
|
L("===") |
|
||||||
|
L("==") |
|
||||||
|
L(">=") |
|
||||||
|
L("<=") |
|
||||||
|
L("!=") |
|
||||||
|
L("~=") |
|
||||||
|
L(">") |
|
||||||
|
L("<")
|
||||||
|
)
|
||||||
|
|
||||||
|
MARKER_OP = VERSION_CMP | L("not in") | L("in")
|
||||||
|
MARKER_OP.setParseAction(lambda s, l, t: Op(t[0]))
|
||||||
|
|
||||||
|
MARKER_VALUE = QuotedString("'") | QuotedString('"')
|
||||||
|
MARKER_VALUE.setParseAction(lambda s, l, t: Value(t[0]))
|
||||||
|
|
||||||
|
BOOLOP = L("and") | L("or")
|
||||||
|
|
||||||
|
MARKER_VAR = VARIABLE | MARKER_VALUE
|
||||||
|
|
||||||
|
MARKER_ITEM = Group(MARKER_VAR + MARKER_OP + MARKER_VAR)
|
||||||
|
MARKER_ITEM.setParseAction(lambda s, l, t: tuple(t[0]))
|
||||||
|
|
||||||
|
LPAREN = L("(").suppress()
|
||||||
|
RPAREN = L(")").suppress()
|
||||||
|
|
||||||
|
MARKER_EXPR = Forward()
|
||||||
|
MARKER_ATOM = MARKER_ITEM | Group(LPAREN + MARKER_EXPR + RPAREN)
|
||||||
|
MARKER_EXPR << MARKER_ATOM + ZeroOrMore(BOOLOP + MARKER_EXPR)
|
||||||
|
|
||||||
|
MARKER = stringStart + MARKER_EXPR + stringEnd
|
||||||
|
|
||||||
|
|
||||||
|
def _coerce_parse_result(results):
|
||||||
|
if isinstance(results, ParseResults):
|
||||||
|
return [_coerce_parse_result(i) for i in results]
|
||||||
|
else:
|
||||||
|
return results
|
||||||
|
|
||||||
|
|
||||||
|
def _format_marker(marker, first=True):
|
||||||
|
assert isinstance(marker, (list, tuple, string_types))
|
||||||
|
|
||||||
|
# Sometimes we have a structure like [[...]] which is a single item list
|
||||||
|
# where the single item is itself it's own list. In that case we want skip
|
||||||
|
# the rest of this function so that we don't get extraneous () on the
|
||||||
|
# outside.
|
||||||
|
if (isinstance(marker, list) and len(marker) == 1 and
|
||||||
|
isinstance(marker[0], (list, tuple))):
|
||||||
|
return _format_marker(marker[0])
|
||||||
|
|
||||||
|
if isinstance(marker, list):
|
||||||
|
inner = (_format_marker(m, first=False) for m in marker)
|
||||||
|
if first:
|
||||||
|
return " ".join(inner)
|
||||||
|
else:
|
||||||
|
return "(" + " ".join(inner) + ")"
|
||||||
|
elif isinstance(marker, tuple):
|
||||||
|
return " ".join([m.serialize() for m in marker])
|
||||||
|
else:
|
||||||
|
return marker
|
||||||
|
|
||||||
|
|
||||||
|
_operators = {
|
||||||
|
"in": lambda lhs, rhs: lhs in rhs,
|
||||||
|
"not in": lambda lhs, rhs: lhs not in rhs,
|
||||||
|
"<": operator.lt,
|
||||||
|
"<=": operator.le,
|
||||||
|
"==": operator.eq,
|
||||||
|
"!=": operator.ne,
|
||||||
|
">=": operator.ge,
|
||||||
|
">": operator.gt,
|
||||||
|
}
|
||||||
|
|
||||||
|
|
||||||
|
def _eval_op(lhs, op, rhs):
|
||||||
|
try:
|
||||||
|
spec = Specifier("".join([op.serialize(), rhs]))
|
||||||
|
except InvalidSpecifier:
|
||||||
|
pass
|
||||||
|
else:
|
||||||
|
return spec.contains(lhs)
|
||||||
|
|
||||||
|
oper = _operators.get(op.serialize())
|
||||||
    if oper is None:
        raise UndefinedComparison(
            "Undefined {0!r} on {1!r} and {2!r}.".format(op, lhs, rhs)
        )

    return oper(lhs, rhs)


_undefined = object()


def _get_env(environment, name):
    value = environment.get(name, _undefined)

    if value is _undefined:
        raise UndefinedEnvironmentName(
            "{0!r} does not exist in evaluation environment.".format(name)
        )

    return value


def _evaluate_markers(markers, environment):
    groups = [[]]

    for marker in markers:
        assert isinstance(marker, (list, tuple, string_types))

        if isinstance(marker, list):
            groups[-1].append(_evaluate_markers(marker, environment))
        elif isinstance(marker, tuple):
            lhs, op, rhs = marker

            if isinstance(lhs, Variable):
                lhs_value = _get_env(environment, lhs.value)
                rhs_value = rhs.value
            else:
                lhs_value = lhs.value
                rhs_value = _get_env(environment, rhs.value)

            groups[-1].append(_eval_op(lhs_value, op, rhs_value))
        else:
            assert marker in ["and", "or"]
            if marker == "or":
                groups.append([])

    return any(all(item) for item in groups)


def format_full_version(info):
    version = '{0.major}.{0.minor}.{0.micro}'.format(info)
    kind = info.releaselevel
    if kind != 'final':
        version += kind[0] + str(info.serial)
    return version


def default_environment():
    if hasattr(sys, 'implementation'):
        iver = format_full_version(sys.implementation.version)
        implementation_name = sys.implementation.name
    else:
        iver = '0'
        implementation_name = ''

    return {
        "implementation_name": implementation_name,
        "implementation_version": iver,
        "os_name": os.name,
        "platform_machine": platform.machine(),
        "platform_release": platform.release(),
        "platform_system": platform.system(),
        "platform_version": platform.version(),
        "python_full_version": platform.python_version(),
        "platform_python_implementation": platform.python_implementation(),
        "python_version": platform.python_version()[:3],
        "sys_platform": sys.platform,
    }


class Marker(object):

    def __init__(self, marker):
        try:
            self._markers = _coerce_parse_result(MARKER.parseString(marker))
        except ParseException as e:
            err_str = "Invalid marker: {0!r}, parse error at {1!r}".format(
                marker, marker[e.loc:e.loc + 8])
            raise InvalidMarker(err_str)

    def __str__(self):
        return _format_marker(self._markers)

    def __repr__(self):
        return "<Marker({0!r})>".format(str(self))

    def evaluate(self, environment=None):
        """Evaluate a marker.

        Return the boolean from evaluating the given marker against the
        environment. environment is an optional argument to override all or
        part of the determined environment.

        The environment is determined from the current Python process.
        """
        current_environment = default_environment()
        if environment is not None:
            current_environment.update(environment)

        return _evaluate_markers(self._markers, current_environment)
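Aside (not part of the vendored file): the `any(all(...))` grouping rule used by `_evaluate_markers` above can be sketched with plain tuples. `evaluate_flat_markers` is an illustrative simplification that handles only string equality and a flat marker list, not the parsed `Variable`/`Op`/`Value` objects the real function receives.

```python
# Minimal sketch of the grouping rule in _evaluate_markers: "or" starts a
# new group, items within a group are and-ed, and the marker is true if
# any group has all of its comparisons true.
def evaluate_flat_markers(markers, environment):
    groups = [[]]                      # list of "and" groups, joined by "or"
    for marker in markers:
        if marker == "or":
            groups.append([])          # start a fresh "and" group
        elif marker == "and":
            continue                   # items in the same group are and-ed
        else:
            name, op, value = marker
            assert op == "=="          # only equality in this sketch
            groups[-1].append(environment.get(name) == value)
    return any(all(group) for group in groups)


env = {"os_name": "posix", "sys_platform": "linux"}
# os_name == "nt" or sys_platform == "linux"  ->  True
print(evaluate_flat_markers(
    [("os_name", "==", "nt"), "or", ("sys_platform", "==", "linux")], env))
```

The real implementation additionally recurses into parenthesized sub-lists, which is why `groups[-1]` may also receive the result of a nested `_evaluate_markers` call.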
@@ -0,0 +1,127 @@
# This file is dual licensed under the terms of the Apache License, Version
# 2.0, and the BSD License. See the LICENSE file in the root of this repository
# for complete details.
from __future__ import absolute_import, division, print_function

import string
import re

from pkg_resources.extern.pyparsing import stringStart, stringEnd, originalTextFor, ParseException
from pkg_resources.extern.pyparsing import ZeroOrMore, Word, Optional, Regex, Combine
from pkg_resources.extern.pyparsing import Literal as L  # noqa
from pkg_resources.extern.six.moves.urllib import parse as urlparse

from .markers import MARKER_EXPR, Marker
from .specifiers import LegacySpecifier, Specifier, SpecifierSet


class InvalidRequirement(ValueError):
    """
    An invalid requirement was found, users should refer to PEP 508.
    """


ALPHANUM = Word(string.ascii_letters + string.digits)

LBRACKET = L("[").suppress()
RBRACKET = L("]").suppress()
LPAREN = L("(").suppress()
RPAREN = L(")").suppress()
COMMA = L(",").suppress()
SEMICOLON = L(";").suppress()
AT = L("@").suppress()

PUNCTUATION = Word("-_.")
IDENTIFIER_END = ALPHANUM | (ZeroOrMore(PUNCTUATION) + ALPHANUM)
IDENTIFIER = Combine(ALPHANUM + ZeroOrMore(IDENTIFIER_END))

NAME = IDENTIFIER("name")
EXTRA = IDENTIFIER

URI = Regex(r'[^ ]+')("url")
URL = (AT + URI)

EXTRAS_LIST = EXTRA + ZeroOrMore(COMMA + EXTRA)
EXTRAS = (LBRACKET + Optional(EXTRAS_LIST) + RBRACKET)("extras")

VERSION_PEP440 = Regex(Specifier._regex_str, re.VERBOSE | re.IGNORECASE)
VERSION_LEGACY = Regex(LegacySpecifier._regex_str, re.VERBOSE | re.IGNORECASE)

VERSION_ONE = VERSION_PEP440 ^ VERSION_LEGACY
VERSION_MANY = Combine(VERSION_ONE + ZeroOrMore(COMMA + VERSION_ONE),
                       joinString=",", adjacent=False)("_raw_spec")
_VERSION_SPEC = Optional(((LPAREN + VERSION_MANY + RPAREN) | VERSION_MANY))
_VERSION_SPEC.setParseAction(lambda s, l, t: t._raw_spec or '')

VERSION_SPEC = originalTextFor(_VERSION_SPEC)("specifier")
VERSION_SPEC.setParseAction(lambda s, l, t: t[1])

MARKER_EXPR = originalTextFor(MARKER_EXPR())("marker")
MARKER_EXPR.setParseAction(
    lambda s, l, t: Marker(s[t._original_start:t._original_end])
)
MARKER_SEPERATOR = SEMICOLON
MARKER = MARKER_SEPERATOR + MARKER_EXPR

VERSION_AND_MARKER = VERSION_SPEC + Optional(MARKER)
URL_AND_MARKER = URL + Optional(MARKER)

NAMED_REQUIREMENT = \
    NAME + Optional(EXTRAS) + (URL_AND_MARKER | VERSION_AND_MARKER)

REQUIREMENT = stringStart + NAMED_REQUIREMENT + stringEnd


class Requirement(object):
    """Parse a requirement.

    Parse a given requirement string into its parts, such as name, specifier,
    URL, and extras. Raises InvalidRequirement on a badly-formed requirement
    string.
    """

    # TODO: Can we test whether something is contained within a requirement?
    #       If so how do we do that? Do we need to test against the _name_ of
    #       the thing as well as the version? What about the markers?
    # TODO: Can we normalize the name and extra name?

    def __init__(self, requirement_string):
        try:
            req = REQUIREMENT.parseString(requirement_string)
        except ParseException as e:
            raise InvalidRequirement(
                "Invalid requirement, parse error at \"{0!r}\"".format(
                    requirement_string[e.loc:e.loc + 8]))

        self.name = req.name
        if req.url:
            parsed_url = urlparse.urlparse(req.url)
            if not (parsed_url.scheme and parsed_url.netloc) or (
                    not parsed_url.scheme and not parsed_url.netloc):
                raise InvalidRequirement("Invalid URL given")
            self.url = req.url
        else:
            self.url = None
        self.extras = set(req.extras.asList() if req.extras else [])
        self.specifier = SpecifierSet(req.specifier)
        self.marker = req.marker if req.marker else None

    def __str__(self):
        parts = [self.name]

        if self.extras:
            parts.append("[{0}]".format(",".join(sorted(self.extras))))

        if self.specifier:
            parts.append(str(self.specifier))

        if self.url:
            parts.append("@ {0}".format(self.url))

        if self.marker:
            parts.append("; {0}".format(self.marker))

        return "".join(parts)

    def __repr__(self):
        return "<Requirement({0!r})>".format(str(self))
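Aside (not part of the vendored file): the `NAMED_REQUIREMENT` grammar above splits a PEP 508 string into name, optional extras, version specifier, and optional marker. A rough regex-only sketch of that split, without pyparsing and without validating the specifier or marker contents, looks like this (`parse_requirement` is a hypothetical helper, not part of the library):

```python
# Rough illustrative split of a PEP 508 requirement string into the same
# parts NAMED_REQUIREMENT produces: name [extras] specifiers ; marker.
import re

_REQ = re.compile(
    r"""^\s*
        (?P<name>[A-Za-z0-9][A-Za-z0-9._-]*)   # project name
        \s*(?:\[(?P<extras>[^\]]*)\])?         # optional extras list
        \s*(?P<specifier>[^;]*?)               # version specifiers
        \s*(?:;\s*(?P<marker>.+))?             # optional environment marker
        \s*$""",
    re.VERBOSE,
)


def parse_requirement(text):
    match = _REQ.match(text)
    if match is None:
        raise ValueError("invalid requirement: {0!r}".format(text))
    extras = match.group("extras")
    return {
        "name": match.group("name"),
        "extras": sorted(e.strip() for e in extras.split(",")) if extras else [],
        "specifier": match.group("specifier").strip(),
        "marker": match.group("marker"),
    }


print(parse_requirement('requests[security,socks]>=2.8.1,==2.8.*; python_version < "3"'))
```

Unlike the pyparsing grammar, this sketch does not check that the specifier portion matches the PEP 440 regexes, nor does it handle the `@ URL` direct-reference form.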
@@ -0,0 +1,774 @@
# This file is dual licensed under the terms of the Apache License, Version
# 2.0, and the BSD License. See the LICENSE file in the root of this repository
# for complete details.
from __future__ import absolute_import, division, print_function

import abc
import functools
import itertools
import re

from ._compat import string_types, with_metaclass
from .version import Version, LegacyVersion, parse


class InvalidSpecifier(ValueError):
    """
    An invalid specifier was found, users should refer to PEP 440.
    """


class BaseSpecifier(with_metaclass(abc.ABCMeta, object)):

    @abc.abstractmethod
    def __str__(self):
        """
        Returns the str representation of this Specifier like object. This
        should be representative of the Specifier itself.
        """

    @abc.abstractmethod
    def __hash__(self):
        """
        Returns a hash value for this Specifier like object.
        """

    @abc.abstractmethod
    def __eq__(self, other):
        """
        Returns a boolean representing whether or not the two Specifier like
        objects are equal.
        """

    @abc.abstractmethod
    def __ne__(self, other):
        """
        Returns a boolean representing whether or not the two Specifier like
        objects are not equal.
        """

    @abc.abstractproperty
    def prereleases(self):
        """
        Returns whether or not pre-releases as a whole are allowed by this
        specifier.
        """

    @prereleases.setter
    def prereleases(self, value):
        """
        Sets whether or not pre-releases as a whole are allowed by this
        specifier.
        """

    @abc.abstractmethod
    def contains(self, item, prereleases=None):
        """
        Determines if the given item is contained within this specifier.
        """

    @abc.abstractmethod
    def filter(self, iterable, prereleases=None):
        """
        Takes an iterable of items and filters them so that only items which
        are contained within this specifier are allowed in it.
        """


class _IndividualSpecifier(BaseSpecifier):

    _operators = {}

    def __init__(self, spec="", prereleases=None):
        match = self._regex.search(spec)
        if not match:
            raise InvalidSpecifier("Invalid specifier: '{0}'".format(spec))

        self._spec = (
            match.group("operator").strip(),
            match.group("version").strip(),
        )

        # Store whether or not this Specifier should accept prereleases
        self._prereleases = prereleases

    def __repr__(self):
        pre = (
            ", prereleases={0!r}".format(self.prereleases)
            if self._prereleases is not None
            else ""
        )

        return "<{0}({1!r}{2})>".format(
            self.__class__.__name__,
            str(self),
            pre,
        )

    def __str__(self):
        return "{0}{1}".format(*self._spec)

    def __hash__(self):
        return hash(self._spec)

    def __eq__(self, other):
        if isinstance(other, string_types):
            try:
                other = self.__class__(other)
            except InvalidSpecifier:
                return NotImplemented
        elif not isinstance(other, self.__class__):
            return NotImplemented

        return self._spec == other._spec

    def __ne__(self, other):
        if isinstance(other, string_types):
            try:
                other = self.__class__(other)
            except InvalidSpecifier:
                return NotImplemented
        elif not isinstance(other, self.__class__):
            return NotImplemented

        return self._spec != other._spec

    def _get_operator(self, op):
        return getattr(self, "_compare_{0}".format(self._operators[op]))

    def _coerce_version(self, version):
        if not isinstance(version, (LegacyVersion, Version)):
            version = parse(version)
        return version

    @property
    def operator(self):
        return self._spec[0]

    @property
    def version(self):
        return self._spec[1]

    @property
    def prereleases(self):
        return self._prereleases

    @prereleases.setter
    def prereleases(self, value):
        self._prereleases = value

    def __contains__(self, item):
        return self.contains(item)

    def contains(self, item, prereleases=None):
        # Determine if prereleases are to be allowed or not.
        if prereleases is None:
            prereleases = self.prereleases

        # Normalize item to a Version or LegacyVersion, this allows us to have
        # a shortcut for ``"2.0" in Specifier(">=2")
        item = self._coerce_version(item)

        # Determine if we should be supporting prereleases in this specifier
        # or not, if we do not support prereleases than we can short circuit
        # logic if this version is a prereleases.
        if item.is_prerelease and not prereleases:
            return False

        # Actually do the comparison to determine if this item is contained
        # within this Specifier or not.
        return self._get_operator(self.operator)(item, self.version)

    def filter(self, iterable, prereleases=None):
        yielded = False
        found_prereleases = []

        kw = {"prereleases": prereleases if prereleases is not None else True}

        # Attempt to iterate over all the values in the iterable and if any of
        # them match, yield them.
        for version in iterable:
            parsed_version = self._coerce_version(version)

            if self.contains(parsed_version, **kw):
                # If our version is a prerelease, and we were not set to allow
                # prereleases, then we'll store it for later incase nothing
                # else matches this specifier.
                if (parsed_version.is_prerelease and not
                        (prereleases or self.prereleases)):
                    found_prereleases.append(version)
                # Either this is not a prerelease, or we should have been
                # accepting prereleases from the begining.
                else:
                    yielded = True
                    yield version

        # Now that we've iterated over everything, determine if we've yielded
        # any values, and if we have not and we have any prereleases stored up
        # then we will go ahead and yield the prereleases.
        if not yielded and found_prereleases:
            for version in found_prereleases:
                yield version
class LegacySpecifier(_IndividualSpecifier):

    _regex_str = (
        r"""
        (?P<operator>(==|!=|<=|>=|<|>))
        \s*
        (?P<version>
            [^,;\s)]* # Since this is a "legacy" specifier, and the version
                      # string can be just about anything, we match everything
                      # except for whitespace, a semi-colon for marker support,
                      # a closing paren since versions can be enclosed in
                      # them, and a comma since it's a version separator.
        )
        """
    )

    _regex = re.compile(
        r"^\s*" + _regex_str + r"\s*$", re.VERBOSE | re.IGNORECASE)

    _operators = {
        "==": "equal",
        "!=": "not_equal",
        "<=": "less_than_equal",
        ">=": "greater_than_equal",
        "<": "less_than",
        ">": "greater_than",
    }

    def _coerce_version(self, version):
        if not isinstance(version, LegacyVersion):
            version = LegacyVersion(str(version))
        return version

    def _compare_equal(self, prospective, spec):
        return prospective == self._coerce_version(spec)

    def _compare_not_equal(self, prospective, spec):
        return prospective != self._coerce_version(spec)

    def _compare_less_than_equal(self, prospective, spec):
        return prospective <= self._coerce_version(spec)

    def _compare_greater_than_equal(self, prospective, spec):
        return prospective >= self._coerce_version(spec)

    def _compare_less_than(self, prospective, spec):
        return prospective < self._coerce_version(spec)

    def _compare_greater_than(self, prospective, spec):
        return prospective > self._coerce_version(spec)


def _require_version_compare(fn):
    @functools.wraps(fn)
    def wrapped(self, prospective, spec):
        if not isinstance(prospective, Version):
            return False
        return fn(self, prospective, spec)
    return wrapped
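Aside (not part of the vendored file): the guard pattern in `_require_version_compare` above makes a decorated comparison return `False` outright for anything that is not a PEP 440 `Version`, instead of raising. A standalone sketch with a minimal stand-in `Version` class (the real one lives in `.version`):

```python
# Sketch of the _require_version_compare guard: non-Version prospectives
# never match, parseable ones fall through to the wrapped comparison.
import functools


class Version(object):               # stand-in for packaging's Version
    def __init__(self, text):
        self.parts = tuple(int(p) for p in text.split("."))

    def __le__(self, other):
        return self.parts <= other.parts


def require_version_compare(fn):
    @functools.wraps(fn)
    def wrapped(prospective, spec):
        if not isinstance(prospective, Version):
            return False             # legacy/unparseable versions never match
        return fn(prospective, spec)
    return wrapped


@require_version_compare
def compare_less_than_equal(prospective, spec):
    return prospective <= Version(spec)


print(compare_less_than_equal(Version("1.2"), "1.3"))   # True
print(compare_less_than_equal("1.2", "1.3"))            # False: not a Version
```

In the vendored code the wrapped functions are methods (hence the extra `self` parameter there); this sketch drops that detail for brevity.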
class Specifier(_IndividualSpecifier):

    _regex_str = (
        r"""
        (?P<operator>(~=|==|!=|<=|>=|<|>|===))
        (?P<version>
            (?:
                # The identity operators allow for an escape hatch that will
                # do an exact string match of the version you wish to install.
                # This will not be parsed by PEP 440 and we cannot determine
                # any semantic meaning from it. This operator is discouraged
                # but included entirely as an escape hatch.
                (?<====)  # Only match for the identity operator
                \s*
                [^\s]*    # We just match everything, except for whitespace
                          # since we are only testing for strict identity.
            )
            |
            (?:
                # The (non)equality operators allow for wild card and local
                # versions to be specified so we have to define these two
                # operators separately to enable that.
                (?<===|!=)            # Only match for equals and not equals

                \s*
                v?
                (?:[0-9]+!)?          # epoch
                [0-9]+(?:\.[0-9]+)*   # release
                (?:                   # pre release
                    [-_\.]?
                    (a|b|c|rc|alpha|beta|pre|preview)
                    [-_\.]?
                    [0-9]*
                )?
                (?:                   # post release
                    (?:-[0-9]+)|(?:[-_\.]?(post|rev|r)[-_\.]?[0-9]*)
                )?

                # You cannot use a wild card and a dev or local version
                # together so group them with a | and make them optional.
                (?:
                    (?:[-_\.]?dev[-_\.]?[0-9]*)?         # dev release
                    (?:\+[a-z0-9]+(?:[-_\.][a-z0-9]+)*)? # local
                    |
                    \.\*  # Wild card syntax of .*
                )?
            )
            |
            (?:
                # The compatible operator requires at least two digits in the
                # release segment.
                (?<=~=)               # Only match for the compatible operator

                \s*
                v?
                (?:[0-9]+!)?          # epoch
                [0-9]+(?:\.[0-9]+)+   # release  (We have a + instead of a *)
                (?:                   # pre release
                    [-_\.]?
                    (a|b|c|rc|alpha|beta|pre|preview)
                    [-_\.]?
                    [0-9]*
                )?
                (?:                                   # post release
                    (?:-[0-9]+)|(?:[-_\.]?(post|rev|r)[-_\.]?[0-9]*)
                )?
                (?:[-_\.]?dev[-_\.]?[0-9]*)?          # dev release
            )
            |
            (?:
                # All other operators only allow a sub set of what the
                # (non)equality operators do. Specifically they do not allow
                # local versions to be specified nor do they allow the prefix
                # matching wild cards.
                (?<!==|!=|~=)         # We have special cases for these
                                      # operators so we want to make sure they
                                      # don't match here.

                \s*
                v?
                (?:[0-9]+!)?          # epoch
                [0-9]+(?:\.[0-9]+)*   # release
                (?:                   # pre release
                    [-_\.]?
                    (a|b|c|rc|alpha|beta|pre|preview)
                    [-_\.]?
                    [0-9]*
                )?
                (?:                                   # post release
                    (?:-[0-9]+)|(?:[-_\.]?(post|rev|r)[-_\.]?[0-9]*)
                )?
                (?:[-_\.]?dev[-_\.]?[0-9]*)?          # dev release
            )
        )
        """
    )

    _regex = re.compile(
        r"^\s*" + _regex_str + r"\s*$", re.VERBOSE | re.IGNORECASE)

    _operators = {
        "~=": "compatible",
        "==": "equal",
        "!=": "not_equal",
        "<=": "less_than_equal",
        ">=": "greater_than_equal",
        "<": "less_than",
        ">": "greater_than",
        "===": "arbitrary",
    }

    @_require_version_compare
    def _compare_compatible(self, prospective, spec):
        # Compatible releases have an equivalent combination of >= and ==. That
        # is that ~=2.2 is equivalent to >=2.2,==2.*. This allows us to
        # implement this in terms of the other specifiers instead of
        # implementing it ourselves. The only thing we need to do is construct
        # the other specifiers.

        # We want everything but the last item in the version, but we want to
        # ignore post and dev releases and we want to treat the pre-release as
        # it's own separate segment.
        prefix = ".".join(
            list(
                itertools.takewhile(
                    lambda x: (not x.startswith("post") and not
                               x.startswith("dev")),
                    _version_split(spec),
                )
            )[:-1]
        )

        # Add the prefix notation to the end of our string
        prefix += ".*"

        return (self._get_operator(">=")(prospective, spec) and
                self._get_operator("==")(prospective, prefix))

    @_require_version_compare
    def _compare_equal(self, prospective, spec):
        # We need special logic to handle prefix matching
        if spec.endswith(".*"):
            # In the case of prefix matching we want to ignore local segment.
            prospective = Version(prospective.public)
            # Split the spec out by dots, and pretend that there is an implicit
            # dot in between a release segment and a pre-release segment.
            spec = _version_split(spec[:-2])  # Remove the trailing .*

            # Split the prospective version out by dots, and pretend that there
            # is an implicit dot in between a release segment and a pre-release
            # segment.
            prospective = _version_split(str(prospective))

            # Shorten the prospective version to be the same length as the spec
            # so that we can determine if the specifier is a prefix of the
            # prospective version or not.
            prospective = prospective[:len(spec)]

            # Pad out our two sides with zeros so that they both equal the same
            # length.
            spec, prospective = _pad_version(spec, prospective)
        else:
            # Convert our spec string into a Version
            spec = Version(spec)

            # If the specifier does not have a local segment, then we want to
            # act as if the prospective version also does not have a local
            # segment.
            if not spec.local:
                prospective = Version(prospective.public)

        return prospective == spec

    @_require_version_compare
    def _compare_not_equal(self, prospective, spec):
        return not self._compare_equal(prospective, spec)

    @_require_version_compare
    def _compare_less_than_equal(self, prospective, spec):
        return prospective <= Version(spec)

    @_require_version_compare
    def _compare_greater_than_equal(self, prospective, spec):
        return prospective >= Version(spec)

    @_require_version_compare
    def _compare_less_than(self, prospective, spec):
        # Convert our spec to a Version instance, since we'll want to work with
        # it as a version.
        spec = Version(spec)

        # Check to see if the prospective version is less than the spec
        # version. If it's not we can short circuit and just return False now
        # instead of doing extra unneeded work.
        if not prospective < spec:
            return False

        # This special case is here so that, unless the specifier itself
        # includes is a pre-release version, that we do not accept pre-release
        # versions for the version mentioned in the specifier (e.g. <3.1 should
        # not match 3.1.dev0, but should match 3.0.dev0).
        if not spec.is_prerelease and prospective.is_prerelease:
            if Version(prospective.base_version) == Version(spec.base_version):
                return False

        # If we've gotten to here, it means that prospective version is both
        # less than the spec version *and* it's not a pre-release of the same
        # version in the spec.
        return True

    @_require_version_compare
    def _compare_greater_than(self, prospective, spec):
        # Convert our spec to a Version instance, since we'll want to work with
        # it as a version.
        spec = Version(spec)

        # Check to see if the prospective version is greater than the spec
        # version. If it's not we can short circuit and just return False now
        # instead of doing extra unneeded work.
        if not prospective > spec:
            return False

        # This special case is here so that, unless the specifier itself
        # includes is a post-release version, that we do not accept
        # post-release versions for the version mentioned in the specifier
        # (e.g. >3.1 should not match 3.0.post0, but should match 3.2.post0).
        if not spec.is_postrelease and prospective.is_postrelease:
            if Version(prospective.base_version) == Version(spec.base_version):
                return False

        # Ensure that we do not allow a local version of the version mentioned
        # in the specifier, which is techincally greater than, to match.
        if prospective.local is not None:
            if Version(prospective.base_version) == Version(spec.base_version):
                return False

        # If we've gotten to here, it means that prospective version is both
        # greater than the spec version *and* it's not a pre-release of the
        # same version in the spec.
        return True

    def _compare_arbitrary(self, prospective, spec):
        return str(prospective).lower() == str(spec).lower()

    @property
    def prereleases(self):
        # If there is an explicit prereleases set for this, then we'll just
        # blindly use that.
        if self._prereleases is not None:
            return self._prereleases

        # Look at all of our specifiers and determine if they are inclusive
        # operators, and if they are if they are including an explicit
        # prerelease.
        operator, version = self._spec
        if operator in ["==", ">=", "<=", "~=", "==="]:
            # The == specifier can include a trailing .*, if it does we
            # want to remove before parsing.
            if operator == "==" and version.endswith(".*"):
                version = version[:-2]

            # Parse the version, and if it is a pre-release than this
            # specifier allows pre-releases.
            if parse(version).is_prerelease:
                return True

        return False

    @prereleases.setter
    def prereleases(self, value):
        self._prereleases = value
_prefix_regex = re.compile(r"^([0-9]+)((?:a|b|c|rc)[0-9]+)$")


def _version_split(version):
    result = []
    for item in version.split("."):
        match = _prefix_regex.search(item)
        if match:
            result.extend(match.groups())
        else:
            result.append(item)
    return result


def _pad_version(left, right):
    left_split, right_split = [], []

    # Get the release segment of our versions
    left_split.append(list(itertools.takewhile(lambda x: x.isdigit(), left)))
    right_split.append(list(itertools.takewhile(lambda x: x.isdigit(), right)))

    # Get the rest of our versions
    left_split.append(left[len(left_split[0]):])
    right_split.append(right[len(right_split[0]):])

    # Insert our padding
    left_split.insert(
        1,
        ["0"] * max(0, len(right_split[0]) - len(left_split[0])),
    )
    right_split.insert(
        1,
        ["0"] * max(0, len(left_split[0]) - len(right_split[0])),
    )

    return (
        list(itertools.chain(*left_split)),
        list(itertools.chain(*right_split)),
    )
class SpecifierSet(BaseSpecifier):

    def __init__(self, specifiers="", prereleases=None):
        # Split on , to break each indidivual specifier into it's own item, and
        # strip each item to remove leading/trailing whitespace.
        specifiers = [s.strip() for s in specifiers.split(",") if s.strip()]

        # Parsed each individual specifier, attempting first to make it a
        # Specifier and falling back to a LegacySpecifier.
        parsed = set()
        for specifier in specifiers:
            try:
                parsed.add(Specifier(specifier))
            except InvalidSpecifier:
                parsed.add(LegacySpecifier(specifier))

        # Turn our parsed specifiers into a frozen set and save them for later.
        self._specs = frozenset(parsed)

        # Store our prereleases value so we can use it later to determine if
        # we accept prereleases or not.
        self._prereleases = prereleases

    def __repr__(self):
        pre = (
            ", prereleases={0!r}".format(self.prereleases)
            if self._prereleases is not None
            else ""
        )

        return "<SpecifierSet({0!r}{1})>".format(str(self), pre)

    def __str__(self):
        return ",".join(sorted(str(s) for s in self._specs))

    def __hash__(self):
        return hash(self._specs)

    def __and__(self, other):
        if isinstance(other, string_types):
            other = SpecifierSet(other)
        elif not isinstance(other, SpecifierSet):
            return NotImplemented

        specifier = SpecifierSet()
        specifier._specs = frozenset(self._specs | other._specs)

        if self._prereleases is None and other._prereleases is not None:
            specifier._prereleases = other._prereleases
        elif self._prereleases is not None and other._prereleases is None:
            specifier._prereleases = self._prereleases
        elif self._prereleases == other._prereleases:
            specifier._prereleases = self._prereleases
        else:
            raise ValueError(
                "Cannot combine SpecifierSets with True and False prerelease "
                "overrides."
            )

        return specifier

    def __eq__(self, other):
        if isinstance(other, string_types):
            other = SpecifierSet(other)
        elif isinstance(other, _IndividualSpecifier):
            other = SpecifierSet(str(other))
        elif not isinstance(other, SpecifierSet):
            return NotImplemented

        return self._specs == other._specs

    def __ne__(self, other):
        if isinstance(other, string_types):
            other = SpecifierSet(other)
        elif isinstance(other, _IndividualSpecifier):
            other = SpecifierSet(str(other))
        elif not isinstance(other, SpecifierSet):
            return NotImplemented

        return self._specs != other._specs

    def __len__(self):
        return len(self._specs)

    def __iter__(self):
        return iter(self._specs)

    @property
    def prereleases(self):
        # If we have been given an explicit prerelease modifier, then we'll
        # pass that through here.
        if self._prereleases is not None:
            return self._prereleases

        # If we don't have any specifiers, and we don't have a forced value,
        # then we'll just return None since we don't know if this should have
        # pre-releases or not.
        if not self._specs:
            return None

        # Otherwise we'll see if any of the given specifiers accept
        # prereleases, if any of them do we'll return True, otherwise False.
        return any(s.prereleases for s in self._specs)

    @prereleases.setter
    def prereleases(self, value):
        self._prereleases = value

    def __contains__(self, item):
        return self.contains(item)

    def contains(self, item, prereleases=None):
        # Ensure that our item is a Version or LegacyVersion instance.
        if not isinstance(item, (LegacyVersion, Version)):
            item = parse(item)

        # Determine if we're forcing a prerelease or not, if we're not forcing
        # one for this particular filter call, then we'll use whatever the
|
||||||
|
# SpecifierSet thinks for whether or not we should support prereleases.
|
||||||
|
if prereleases is None:
|
||||||
|
prereleases = self.prereleases
|
||||||
|
|
||||||
|
# We can determine if we're going to allow pre-releases by looking to
|
||||||
|
# see if any of the underlying items supports them. If none of them do
|
||||||
|
# and this item is a pre-release then we do not allow it and we can
|
||||||
|
# short circuit that here.
|
||||||
|
# Note: This means that 1.0.dev1 would not be contained in something
|
||||||
|
# like >=1.0.devabc however it would be in >=1.0.debabc,>0.0.dev0
|
||||||
|
if not prereleases and item.is_prerelease:
|
||||||
|
return False
|
||||||
|
|
||||||
|
# We simply dispatch to the underlying specs here to make sure that the
|
||||||
|
# given version is contained within all of them.
|
||||||
|
# Note: This use of all() here means that an empty set of specifiers
|
||||||
|
# will always return True, this is an explicit design decision.
|
||||||
|
return all(
|
||||||
|
s.contains(item, prereleases=prereleases)
|
||||||
|
for s in self._specs
|
||||||
|
)
|
||||||
|
|
||||||
|
def filter(self, iterable, prereleases=None):
|
||||||
|
# Determine if we're forcing a prerelease or not, if we're not forcing
|
||||||
|
# one for this particular filter call, then we'll use whatever the
|
||||||
|
# SpecifierSet thinks for whether or not we should support prereleases.
|
||||||
|
if prereleases is None:
|
||||||
|
prereleases = self.prereleases
|
||||||
|
|
||||||
|
# If we have any specifiers, then we want to wrap our iterable in the
|
||||||
|
# filter method for each one, this will act as a logical AND amongst
|
||||||
|
# each specifier.
|
||||||
|
if self._specs:
|
||||||
|
for spec in self._specs:
|
||||||
|
iterable = spec.filter(iterable, prereleases=bool(prereleases))
|
||||||
|
return iterable
|
||||||
|
# If we do not have any specifiers, then we need to have a rough filter
|
||||||
|
# which will filter out any pre-releases, unless there are no final
|
||||||
|
# releases, and which will filter out LegacyVersion in general.
|
||||||
|
else:
|
||||||
|
filtered = []
|
||||||
|
found_prereleases = []
|
||||||
|
|
||||||
|
for item in iterable:
|
||||||
|
# Ensure that we some kind of Version class for this item.
|
||||||
|
if not isinstance(item, (LegacyVersion, Version)):
|
||||||
|
parsed_version = parse(item)
|
||||||
|
else:
|
||||||
|
parsed_version = item
|
||||||
|
|
||||||
|
# Filter out any item which is parsed as a LegacyVersion
|
||||||
|
if isinstance(parsed_version, LegacyVersion):
|
||||||
|
continue
|
||||||
|
|
||||||
|
# Store any item which is a pre-release for later unless we've
|
||||||
|
# already found a final version or we are accepting prereleases
|
||||||
|
if parsed_version.is_prerelease and not prereleases:
|
||||||
|
if not filtered:
|
||||||
|
found_prereleases.append(item)
|
||||||
|
else:
|
||||||
|
filtered.append(item)
|
||||||
|
|
||||||
|
# If we've found no items except for pre-releases, then we'll go
|
||||||
|
# ahead and use the pre-releases
|
||||||
|
if not filtered and found_prereleases and prereleases is None:
|
||||||
|
return found_prereleases
|
||||||
|
|
||||||
|
return filtered
|
||||||
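The `SpecifierSet` logic above (AND across specifiers, pre-release gating via `contains`/`filter`) can be exercised through the standalone `packaging` distribution, which is the upstream home of this vendored code. A minimal sketch, assuming `packaging` is installed:

```python
from packaging.specifiers import SpecifierSet

spec = SpecifierSet(">=1.0,<2.0")

# contains() / __contains__ check a version against ALL specifiers
assert "1.5" in spec
assert "2.1" not in spec

# filter() chains each specifier over the candidates (a logical AND)
assert list(spec.filter(["0.9", "1.0", "1.5", "2.0"])) == ["1.0", "1.5"]

# pre-releases are rejected by default, but can be forced per call
assert "1.5rc1" not in spec
assert spec.contains("1.5rc1", prereleases=True)
```

Note the design decision documented in `contains`: an empty `SpecifierSet()` accepts every version, because `all()` over an empty set is `True`.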
@@ -0,0 +1,14 @@
# This file is dual licensed under the terms of the Apache License, Version
# 2.0, and the BSD License. See the LICENSE file in the root of this repository
# for complete details.
from __future__ import absolute_import, division, print_function

import re


_canonicalize_regex = re.compile(r"[-_.]+")


def canonicalize_name(name):
    # This is taken from PEP 503.
    return _canonicalize_regex.sub("-", name).lower()
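The PEP 503 normalization above collapses any run of `-`, `_`, or `.` into a single hyphen and lowercases the result, so differently-spelled project names compare equal. A self-contained sketch of the same helper:

```python
import re

# Same PEP 503 rule as the vendored helper: runs of "-", "_", "." become "-"
_canonicalize_regex = re.compile(r"[-_.]+")


def canonicalize_name(name):
    # Collapse separator runs to a single hyphen, then lowercase
    return _canonicalize_regex.sub("-", name).lower()


assert canonicalize_name("Twisted") == "twisted"
assert canonicalize_name("zope.interface") == "zope-interface"
assert canonicalize_name("Flask__RESTful") == "flask-restful"
```

This is why an index treats `zope.interface`, `Zope-Interface`, and `zope_interface` as the same project.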
@@ -0,0 +1,393 @@
# This file is dual licensed under the terms of the Apache License, Version
# 2.0, and the BSD License. See the LICENSE file in the root of this repository
# for complete details.
from __future__ import absolute_import, division, print_function

import collections
import itertools
import re

from ._structures import Infinity


__all__ = [
    "parse", "Version", "LegacyVersion", "InvalidVersion", "VERSION_PATTERN"
]


_Version = collections.namedtuple(
    "_Version",
    ["epoch", "release", "dev", "pre", "post", "local"],
)


def parse(version):
    """
    Parse the given version string and return either a :class:`Version` object
    or a :class:`LegacyVersion` object depending on if the given version is
    a valid PEP 440 version or a legacy version.
    """
    try:
        return Version(version)
    except InvalidVersion:
        return LegacyVersion(version)


class InvalidVersion(ValueError):
    """
    An invalid version was found, users should refer to PEP 440.
    """


class _BaseVersion(object):

    def __hash__(self):
        return hash(self._key)

    def __lt__(self, other):
        return self._compare(other, lambda s, o: s < o)

    def __le__(self, other):
        return self._compare(other, lambda s, o: s <= o)

    def __eq__(self, other):
        return self._compare(other, lambda s, o: s == o)

    def __ge__(self, other):
        return self._compare(other, lambda s, o: s >= o)

    def __gt__(self, other):
        return self._compare(other, lambda s, o: s > o)

    def __ne__(self, other):
        return self._compare(other, lambda s, o: s != o)

    def _compare(self, other, method):
        if not isinstance(other, _BaseVersion):
            return NotImplemented

        return method(self._key, other._key)


class LegacyVersion(_BaseVersion):

    def __init__(self, version):
        self._version = str(version)
        self._key = _legacy_cmpkey(self._version)

    def __str__(self):
        return self._version

    def __repr__(self):
        return "<LegacyVersion({0})>".format(repr(str(self)))

    @property
    def public(self):
        return self._version

    @property
    def base_version(self):
        return self._version

    @property
    def local(self):
        return None

    @property
    def is_prerelease(self):
        return False

    @property
    def is_postrelease(self):
        return False


_legacy_version_component_re = re.compile(
    r"(\d+ | [a-z]+ | \.| -)", re.VERBOSE,
)

_legacy_version_replacement_map = {
    "pre": "c", "preview": "c", "-": "final-", "rc": "c", "dev": "@",
}


def _parse_version_parts(s):
    for part in _legacy_version_component_re.split(s):
        part = _legacy_version_replacement_map.get(part, part)

        if not part or part == ".":
            continue

        if part[:1] in "0123456789":
            # pad for numeric comparison
            yield part.zfill(8)
        else:
            yield "*" + part

    # ensure that alpha/beta/candidate are before final
    yield "*final"


def _legacy_cmpkey(version):
    # We hardcode an epoch of -1 here. A PEP 440 version can only have an epoch
    # greater than or equal to 0. This will effectively put the LegacyVersion,
    # which uses the de facto standard originally implemented by setuptools,
    # as before all PEP 440 versions.
    epoch = -1

    # This scheme is taken from pkg_resources.parse_version setuptools prior to
    # its adoption of the packaging library.
    parts = []
    for part in _parse_version_parts(version.lower()):
        if part.startswith("*"):
            # remove "-" before a prerelease tag
            if part < "*final":
                while parts and parts[-1] == "*final-":
                    parts.pop()

            # remove trailing zeros from each series of numeric parts
            while parts and parts[-1] == "00000000":
                parts.pop()

        parts.append(part)
    parts = tuple(parts)

    return epoch, parts


# Deliberately not anchored to the start and end of the string, to make it
# easier for 3rd party code to reuse
VERSION_PATTERN = r"""
    v?
    (?:
        (?:(?P<epoch>[0-9]+)!)?                           # epoch
        (?P<release>[0-9]+(?:\.[0-9]+)*)                  # release segment
        (?P<pre>                                          # pre-release
            [-_\.]?
            (?P<pre_l>(a|b|c|rc|alpha|beta|pre|preview))
            [-_\.]?
            (?P<pre_n>[0-9]+)?
        )?
        (?P<post>                                         # post release
            (?:-(?P<post_n1>[0-9]+))
            |
            (?:
                [-_\.]?
                (?P<post_l>post|rev|r)
                [-_\.]?
                (?P<post_n2>[0-9]+)?
            )
        )?
        (?P<dev>                                          # dev release
            [-_\.]?
            (?P<dev_l>dev)
            [-_\.]?
            (?P<dev_n>[0-9]+)?
        )?
    )
    (?:\+(?P<local>[a-z0-9]+(?:[-_\.][a-z0-9]+)*))?       # local version
"""


class Version(_BaseVersion):

    _regex = re.compile(
        r"^\s*" + VERSION_PATTERN + r"\s*$",
        re.VERBOSE | re.IGNORECASE,
    )

    def __init__(self, version):
        # Validate the version and parse it into pieces
        match = self._regex.search(version)
        if not match:
            raise InvalidVersion("Invalid version: '{0}'".format(version))

        # Store the parsed out pieces of the version
        self._version = _Version(
            epoch=int(match.group("epoch")) if match.group("epoch") else 0,
            release=tuple(int(i) for i in match.group("release").split(".")),
            pre=_parse_letter_version(
                match.group("pre_l"),
                match.group("pre_n"),
            ),
            post=_parse_letter_version(
                match.group("post_l"),
                match.group("post_n1") or match.group("post_n2"),
            ),
            dev=_parse_letter_version(
                match.group("dev_l"),
                match.group("dev_n"),
            ),
            local=_parse_local_version(match.group("local")),
        )

        # Generate a key which will be used for sorting
        self._key = _cmpkey(
            self._version.epoch,
            self._version.release,
            self._version.pre,
            self._version.post,
            self._version.dev,
            self._version.local,
        )

    def __repr__(self):
        return "<Version({0})>".format(repr(str(self)))

    def __str__(self):
        parts = []

        # Epoch
        if self._version.epoch != 0:
            parts.append("{0}!".format(self._version.epoch))

        # Release segment
        parts.append(".".join(str(x) for x in self._version.release))

        # Pre-release
        if self._version.pre is not None:
            parts.append("".join(str(x) for x in self._version.pre))

        # Post-release
        if self._version.post is not None:
            parts.append(".post{0}".format(self._version.post[1]))

        # Development release
        if self._version.dev is not None:
            parts.append(".dev{0}".format(self._version.dev[1]))

        # Local version segment
        if self._version.local is not None:
            parts.append(
                "+{0}".format(".".join(str(x) for x in self._version.local))
            )

        return "".join(parts)

    @property
    def public(self):
        return str(self).split("+", 1)[0]

    @property
    def base_version(self):
        parts = []

        # Epoch
        if self._version.epoch != 0:
            parts.append("{0}!".format(self._version.epoch))

        # Release segment
        parts.append(".".join(str(x) for x in self._version.release))

        return "".join(parts)

    @property
    def local(self):
        version_string = str(self)
        if "+" in version_string:
            return version_string.split("+", 1)[1]

    @property
    def is_prerelease(self):
        return bool(self._version.dev or self._version.pre)

    @property
    def is_postrelease(self):
        return bool(self._version.post)


def _parse_letter_version(letter, number):
    if letter:
        # We consider there to be an implicit 0 in a pre-release if there is
        # not a numeral associated with it.
        if number is None:
            number = 0

        # We normalize any letters to their lower case form
        letter = letter.lower()

        # We consider some words to be alternate spellings of other words and
        # in those cases we want to normalize the spellings to our preferred
        # spelling.
        if letter == "alpha":
            letter = "a"
        elif letter == "beta":
            letter = "b"
        elif letter in ["c", "pre", "preview"]:
            letter = "rc"
        elif letter in ["rev", "r"]:
            letter = "post"

        return letter, int(number)
    if not letter and number:
        # We assume if we are given a number, but we are not given a letter
        # then this is using the implicit post release syntax (e.g. 1.0-1)
        letter = "post"

        return letter, int(number)


_local_version_seperators = re.compile(r"[\._-]")


def _parse_local_version(local):
    """
    Takes a string like abc.1.twelve and turns it into ("abc", 1, "twelve").
    """
    if local is not None:
        return tuple(
            part.lower() if not part.isdigit() else int(part)
            for part in _local_version_seperators.split(local)
        )


def _cmpkey(epoch, release, pre, post, dev, local):
    # When we compare a release version, we want to compare it with all of the
    # trailing zeros removed. So we'll reverse the list, drop all the now
    # leading zeros until we come to something non zero, then take the rest,
    # re-reverse it back into the correct order and make it a tuple and use
    # that for our sorting key.
    release = tuple(
        reversed(list(
            itertools.dropwhile(
                lambda x: x == 0,
                reversed(release),
            )
        ))
    )

    # We need to "trick" the sorting algorithm to put 1.0.dev0 before 1.0a0.
    # We'll do this by abusing the pre segment, but we _only_ want to do this
    # if there is not a pre or a post segment. If we have one of those then
    # the normal sorting rules will handle this case correctly.
    if pre is None and post is None and dev is not None:
        pre = -Infinity
    # Versions without a pre-release (except as noted above) should sort after
    # those with one.
    elif pre is None:
        pre = Infinity

    # Versions without a post segment should sort before those with one.
    if post is None:
        post = -Infinity

    # Versions without a development segment should sort after those with one.
    if dev is None:
        dev = Infinity

    if local is None:
        # Versions without a local segment should sort before those with one.
        local = -Infinity
    else:
        # Versions with a local segment need that segment parsed to implement
        # the sorting rules in PEP440.
        # - Alpha numeric segments sort before numeric segments
        # - Alpha numeric segments sort lexicographically
        # - Numeric segments sort numerically
        # - Shorter versions sort before longer versions when the prefixes
        #   match exactly
        local = tuple(
            (i, "") if isinstance(i, int) else (-Infinity, i)
            for i in local
        )

    return epoch, release, pre, post, dev, local
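The `Version` parsing and the `_cmpkey` ordering tricks above (dev sorts below pre, pre below final, final below post) can be observed directly through the standalone `packaging` distribution, which this vendored file comes from. A minimal sketch, assuming `packaging` is installed:

```python
from packaging.version import Version

# Parsing splits a PEP 440 version into its segments
v = Version("1.4.0.post2+ubuntu1")
assert v.public == "1.4.0.post2"
assert v.base_version == "1.4.0"
assert v.local == "ubuntu1"
assert v.is_postrelease

# _cmpkey's ordering: dev < alpha < rc < final < post
versions = ["1.0.post1", "1.0", "1.0rc1", "1.0.dev0", "1.0a1"]
assert sorted(versions, key=Version) == [
    "1.0.dev0", "1.0a1", "1.0rc1", "1.0", "1.0.post1",
]

# Trailing zeros are stripped from the sort key, so 1.0 == 1.0.0
assert Version("1.0") == Version("1.0.0")
```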
File diff suppressed because it is too large
@@ -0,0 +1,868 @@
|
|||||||
|
"""Utilities for writing code that runs on Python 2 and 3"""
|
||||||
|
|
||||||
|
# Copyright (c) 2010-2015 Benjamin Peterson
|
||||||
|
#
|
||||||
|
# Permission is hereby granted, free of charge, to any person obtaining a copy
|
||||||
|
# of this software and associated documentation files (the "Software"), to deal
|
||||||
|
# in the Software without restriction, including without limitation the rights
|
||||||
|
# to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
|
||||||
|
# copies of the Software, and to permit persons to whom the Software is
|
||||||
|
# furnished to do so, subject to the following conditions:
|
||||||
|
#
|
||||||
|
# The above copyright notice and this permission notice shall be included in all
|
||||||
|
# copies or substantial portions of the Software.
|
||||||
|
#
|
||||||
|
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
|
||||||
|
# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
|
||||||
|
# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
|
||||||
|
# AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
|
||||||
|
# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
|
||||||
|
# OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
|
||||||
|
# SOFTWARE.
|
||||||
|
|
||||||
|
from __future__ import absolute_import
|
||||||
|
|
||||||
|
import functools
|
||||||
|
import itertools
|
||||||
|
import operator
|
||||||
|
import sys
|
||||||
|
import types
|
||||||
|
|
||||||
|
__author__ = "Benjamin Peterson <benjamin@python.org>"
|
||||||
|
__version__ = "1.10.0"
|
||||||
|
|
||||||
|
|
||||||
|
# Useful for very coarse version differentiation.
|
||||||
|
PY2 = sys.version_info[0] == 2
|
||||||
|
PY3 = sys.version_info[0] == 3
|
||||||
|
PY34 = sys.version_info[0:2] >= (3, 4)
|
||||||
|
|
||||||
|
if PY3:
|
||||||
|
string_types = str,
|
||||||
|
integer_types = int,
|
||||||
|
class_types = type,
|
||||||
|
text_type = str
|
||||||
|
binary_type = bytes
|
||||||
|
|
||||||
|
MAXSIZE = sys.maxsize
|
||||||
|
else:
|
||||||
|
string_types = basestring,
|
||||||
|
integer_types = (int, long)
|
||||||
|
class_types = (type, types.ClassType)
|
||||||
|
text_type = unicode
|
||||||
|
binary_type = str
|
||||||
|
|
||||||
|
if sys.platform.startswith("java"):
|
||||||
|
# Jython always uses 32 bits.
|
||||||
|
MAXSIZE = int((1 << 31) - 1)
|
||||||
|
else:
|
||||||
|
# It's possible to have sizeof(long) != sizeof(Py_ssize_t).
|
||||||
|
class X(object):
|
||||||
|
|
||||||
|
def __len__(self):
|
||||||
|
return 1 << 31
|
||||||
|
try:
|
||||||
|
len(X())
|
||||||
|
except OverflowError:
|
||||||
|
# 32-bit
|
||||||
|
MAXSIZE = int((1 << 31) - 1)
|
||||||
|
else:
|
||||||
|
# 64-bit
|
||||||
|
MAXSIZE = int((1 << 63) - 1)
|
||||||
|
del X
|
||||||
|
|
||||||
|
|
||||||
|
def _add_doc(func, doc):
|
||||||
|
"""Add documentation to a function."""
|
||||||
|
func.__doc__ = doc
|
||||||
|
|
||||||
|
|
||||||
|
def _import_module(name):
|
||||||
|
"""Import module, returning the module after the last dot."""
|
||||||
|
__import__(name)
|
||||||
|
return sys.modules[name]
|
||||||
|
|
||||||
|
|
||||||
|
class _LazyDescr(object):
|
||||||
|
|
||||||
|
def __init__(self, name):
|
||||||
|
self.name = name
|
||||||
|
|
||||||
|
def __get__(self, obj, tp):
|
||||||
|
result = self._resolve()
|
||||||
|
setattr(obj, self.name, result) # Invokes __set__.
|
||||||
|
try:
|
||||||
|
# This is a bit ugly, but it avoids running this again by
|
||||||
|
# removing this descriptor.
|
||||||
|
delattr(obj.__class__, self.name)
|
||||||
|
except AttributeError:
|
||||||
|
pass
|
||||||
|
return result
|
||||||
|
|
||||||
|
|
||||||
|
class MovedModule(_LazyDescr):
|
||||||
|
|
||||||
|
def __init__(self, name, old, new=None):
|
||||||
|
super(MovedModule, self).__init__(name)
|
||||||
|
if PY3:
|
||||||
|
if new is None:
|
||||||
|
new = name
|
||||||
|
self.mod = new
|
||||||
|
else:
|
||||||
|
self.mod = old
|
||||||
|
|
||||||
|
def _resolve(self):
|
||||||
|
return _import_module(self.mod)
|
||||||
|
|
||||||
|
def __getattr__(self, attr):
|
||||||
|
_module = self._resolve()
|
||||||
|
value = getattr(_module, attr)
|
||||||
|
setattr(self, attr, value)
|
||||||
|
return value
|
||||||
|
|
||||||
|
|
||||||
|
class _LazyModule(types.ModuleType):
|
||||||
|
|
||||||
|
def __init__(self, name):
|
||||||
|
super(_LazyModule, self).__init__(name)
|
||||||
|
self.__doc__ = self.__class__.__doc__
|
||||||
|
|
||||||
|
def __dir__(self):
|
||||||
|
attrs = ["__doc__", "__name__"]
|
||||||
|
attrs += [attr.name for attr in self._moved_attributes]
|
||||||
|
return attrs
|
||||||
|
|
||||||
|
# Subclasses should override this
|
||||||
|
_moved_attributes = []
|
||||||
|
|
||||||
|
|
||||||
|
class MovedAttribute(_LazyDescr):
|
||||||
|
|
||||||
|
def __init__(self, name, old_mod, new_mod, old_attr=None, new_attr=None):
|
||||||
|
super(MovedAttribute, self).__init__(name)
|
||||||
|
if PY3:
|
||||||
|
if new_mod is None:
|
||||||
|
new_mod = name
|
||||||
|
self.mod = new_mod
|
||||||
|
if new_attr is None:
|
||||||
|
if old_attr is None:
|
||||||
|
new_attr = name
|
||||||
|
else:
|
||||||
|
new_attr = old_attr
|
||||||
|
self.attr = new_attr
|
||||||
|
else:
|
||||||
|
self.mod = old_mod
|
||||||
|
if old_attr is None:
|
||||||
|
old_attr = name
|
||||||
|
self.attr = old_attr
|
||||||
|
|
||||||
|
def _resolve(self):
|
||||||
|
module = _import_module(self.mod)
|
||||||
|
return getattr(module, self.attr)
|
||||||
|
|
||||||
|
|
||||||
|
class _SixMetaPathImporter(object):
|
||||||
|
|
||||||
|
"""
|
||||||
|
A meta path importer to import six.moves and its submodules.
|
||||||
|
|
||||||
|
This class implements a PEP302 finder and loader. It should be compatible
|
||||||
|
with Python 2.5 and all existing versions of Python3
|
||||||
|
"""
|
||||||
|
|
||||||
|
def __init__(self, six_module_name):
|
||||||
|
self.name = six_module_name
|
||||||
|
self.known_modules = {}
|
||||||
|
|
||||||
|
def _add_module(self, mod, *fullnames):
|
||||||
|
for fullname in fullnames:
|
||||||
|
self.known_modules[self.name + "." + fullname] = mod
|
||||||
|
|
||||||
|
def _get_module(self, fullname):
|
||||||
|
return self.known_modules[self.name + "." + fullname]
|
||||||
|
|
||||||
|
def find_module(self, fullname, path=None):
|
||||||
|
if fullname in self.known_modules:
|
||||||
|
return self
|
||||||
|
return None
|
||||||
|
|
||||||
|
def __get_module(self, fullname):
|
||||||
|
try:
|
||||||
|
return self.known_modules[fullname]
|
||||||
|
except KeyError:
|
||||||
|
raise ImportError("This loader does not know module " + fullname)
|
||||||
|
|
||||||
|
def load_module(self, fullname):
|
||||||
|
try:
|
||||||
|
# in case of a reload
|
||||||
|
return sys.modules[fullname]
|
||||||
|
except KeyError:
|
||||||
|
pass
|
||||||
|
mod = self.__get_module(fullname)
|
||||||
|
if isinstance(mod, MovedModule):
|
||||||
|
mod = mod._resolve()
|
||||||
|
else:
|
||||||
|
mod.__loader__ = self
|
||||||
|
sys.modules[fullname] = mod
|
||||||
|
return mod
|
||||||
|
|
||||||
|
def is_package(self, fullname):
|
||||||
|
"""
|
||||||
|
Return true, if the named module is a package.
|
||||||
|
|
||||||
|
We need this method to get correct spec objects with
|
||||||
|
Python 3.4 (see PEP451)
|
||||||
|
"""
|
||||||
|
return hasattr(self.__get_module(fullname), "__path__")
|
||||||
|
|
||||||
|
def get_code(self, fullname):
|
||||||
|
"""Return None
|
||||||
|
|
||||||
|
Required, if is_package is implemented"""
|
||||||
|
self.__get_module(fullname) # eventually raises ImportError
|
||||||
|
return None
|
||||||
|
get_source = get_code # same as get_code
|
||||||
|
|
||||||
|
_importer = _SixMetaPathImporter(__name__)
|
||||||
|
|
||||||
|
|
||||||
|
class _MovedItems(_LazyModule):
|
||||||
|
|
||||||
|
"""Lazy loading of moved objects"""
|
||||||
|
__path__ = [] # mark as package
|
||||||
|
|
||||||
|
|
||||||
|
_moved_attributes = [
|
||||||
|
MovedAttribute("cStringIO", "cStringIO", "io", "StringIO"),
|
||||||
|
MovedAttribute("filter", "itertools", "builtins", "ifilter", "filter"),
|
||||||
|
MovedAttribute("filterfalse", "itertools", "itertools", "ifilterfalse", "filterfalse"),
|
||||||
|
MovedAttribute("input", "__builtin__", "builtins", "raw_input", "input"),
|
||||||
|
MovedAttribute("intern", "__builtin__", "sys"),
|
||||||
|
MovedAttribute("map", "itertools", "builtins", "imap", "map"),
|
||||||
|
MovedAttribute("getcwd", "os", "os", "getcwdu", "getcwd"),
|
||||||
|
MovedAttribute("getcwdb", "os", "os", "getcwd", "getcwdb"),
|
||||||
|
MovedAttribute("range", "__builtin__", "builtins", "xrange", "range"),
|
||||||
|
MovedAttribute("reload_module", "__builtin__", "importlib" if PY34 else "imp", "reload"),
|
||||||
|
MovedAttribute("reduce", "__builtin__", "functools"),
|
||||||
|
MovedAttribute("shlex_quote", "pipes", "shlex", "quote"),
|
||||||
|
MovedAttribute("StringIO", "StringIO", "io"),
|
||||||
|
MovedAttribute("UserDict", "UserDict", "collections"),
|
||||||
|
MovedAttribute("UserList", "UserList", "collections"),
|
||||||
|
MovedAttribute("UserString", "UserString", "collections"),
|
||||||
|
MovedAttribute("xrange", "__builtin__", "builtins", "xrange", "range"),
|
||||||
|
MovedAttribute("zip", "itertools", "builtins", "izip", "zip"),
|
||||||
|
MovedAttribute("zip_longest", "itertools", "itertools", "izip_longest", "zip_longest"),
|
||||||
|
MovedModule("builtins", "__builtin__"),
|
||||||
|
MovedModule("configparser", "ConfigParser"),
|
||||||
|
MovedModule("copyreg", "copy_reg"),
|
||||||
|
MovedModule("dbm_gnu", "gdbm", "dbm.gnu"),
|
||||||
|
MovedModule("_dummy_thread", "dummy_thread", "_dummy_thread"),
|
||||||
|
MovedModule("http_cookiejar", "cookielib", "http.cookiejar"),
|
||||||
|
MovedModule("http_cookies", "Cookie", "http.cookies"),
|
||||||
|
MovedModule("html_entities", "htmlentitydefs", "html.entities"),
|
||||||
|
MovedModule("html_parser", "HTMLParser", "html.parser"),
|
||||||
|
MovedModule("http_client", "httplib", "http.client"),
|
||||||
|
MovedModule("email_mime_multipart", "email.MIMEMultipart", "email.mime.multipart"),
|
||||||
|
MovedModule("email_mime_nonmultipart", "email.MIMENonMultipart", "email.mime.nonmultipart"),
|
||||||
|
MovedModule("email_mime_text", "email.MIMEText", "email.mime.text"),
|
||||||
|
MovedModule("email_mime_base", "email.MIMEBase", "email.mime.base"),
|
||||||
|
MovedModule("BaseHTTPServer", "BaseHTTPServer", "http.server"),
|
||||||
|
MovedModule("CGIHTTPServer", "CGIHTTPServer", "http.server"),
|
||||||
|
    MovedModule("SimpleHTTPServer", "SimpleHTTPServer", "http.server"),
    MovedModule("cPickle", "cPickle", "pickle"),
    MovedModule("queue", "Queue"),
    MovedModule("reprlib", "repr"),
    MovedModule("socketserver", "SocketServer"),
    MovedModule("_thread", "thread", "_thread"),
    MovedModule("tkinter", "Tkinter"),
    MovedModule("tkinter_dialog", "Dialog", "tkinter.dialog"),
    MovedModule("tkinter_filedialog", "FileDialog", "tkinter.filedialog"),
    MovedModule("tkinter_scrolledtext", "ScrolledText", "tkinter.scrolledtext"),
    MovedModule("tkinter_simpledialog", "SimpleDialog", "tkinter.simpledialog"),
    MovedModule("tkinter_tix", "Tix", "tkinter.tix"),
    MovedModule("tkinter_ttk", "ttk", "tkinter.ttk"),
    MovedModule("tkinter_constants", "Tkconstants", "tkinter.constants"),
    MovedModule("tkinter_dnd", "Tkdnd", "tkinter.dnd"),
    MovedModule("tkinter_colorchooser", "tkColorChooser",
                "tkinter.colorchooser"),
    MovedModule("tkinter_commondialog", "tkCommonDialog",
                "tkinter.commondialog"),
    MovedModule("tkinter_tkfiledialog", "tkFileDialog", "tkinter.filedialog"),
    MovedModule("tkinter_font", "tkFont", "tkinter.font"),
    MovedModule("tkinter_messagebox", "tkMessageBox", "tkinter.messagebox"),
    MovedModule("tkinter_tksimpledialog", "tkSimpleDialog",
                "tkinter.simpledialog"),
    MovedModule("urllib_parse", __name__ + ".moves.urllib_parse", "urllib.parse"),
    MovedModule("urllib_error", __name__ + ".moves.urllib_error", "urllib.error"),
    MovedModule("urllib", __name__ + ".moves.urllib", __name__ + ".moves.urllib"),
    MovedModule("urllib_robotparser", "robotparser", "urllib.robotparser"),
    MovedModule("xmlrpc_client", "xmlrpclib", "xmlrpc.client"),
    MovedModule("xmlrpc_server", "SimpleXMLRPCServer", "xmlrpc.server"),
]
# Add windows specific modules.
if sys.platform == "win32":
    _moved_attributes += [
        MovedModule("winreg", "_winreg"),
    ]

for attr in _moved_attributes:
    setattr(_MovedItems, attr.name, attr)
    if isinstance(attr, MovedModule):
        _importer._add_module(attr, "moves." + attr.name)
del attr

_MovedItems._moved_attributes = _moved_attributes

moves = _MovedItems(__name__ + ".moves")
_importer._add_module(moves, "moves")

class Module_six_moves_urllib_parse(_LazyModule):

    """Lazy loading of moved objects in six.moves.urllib_parse"""


_urllib_parse_moved_attributes = [
    MovedAttribute("ParseResult", "urlparse", "urllib.parse"),
    MovedAttribute("SplitResult", "urlparse", "urllib.parse"),
    MovedAttribute("parse_qs", "urlparse", "urllib.parse"),
    MovedAttribute("parse_qsl", "urlparse", "urllib.parse"),
    MovedAttribute("urldefrag", "urlparse", "urllib.parse"),
    MovedAttribute("urljoin", "urlparse", "urllib.parse"),
    MovedAttribute("urlparse", "urlparse", "urllib.parse"),
    MovedAttribute("urlsplit", "urlparse", "urllib.parse"),
    MovedAttribute("urlunparse", "urlparse", "urllib.parse"),
    MovedAttribute("urlunsplit", "urlparse", "urllib.parse"),
    MovedAttribute("quote", "urllib", "urllib.parse"),
    MovedAttribute("quote_plus", "urllib", "urllib.parse"),
    MovedAttribute("unquote", "urllib", "urllib.parse"),
    MovedAttribute("unquote_plus", "urllib", "urllib.parse"),
    MovedAttribute("urlencode", "urllib", "urllib.parse"),
    MovedAttribute("splitquery", "urllib", "urllib.parse"),
    MovedAttribute("splittag", "urllib", "urllib.parse"),
    MovedAttribute("splituser", "urllib", "urllib.parse"),
    MovedAttribute("uses_fragment", "urlparse", "urllib.parse"),
    MovedAttribute("uses_netloc", "urlparse", "urllib.parse"),
    MovedAttribute("uses_params", "urlparse", "urllib.parse"),
    MovedAttribute("uses_query", "urlparse", "urllib.parse"),
    MovedAttribute("uses_relative", "urlparse", "urllib.parse"),
]
for attr in _urllib_parse_moved_attributes:
    setattr(Module_six_moves_urllib_parse, attr.name, attr)
del attr

Module_six_moves_urllib_parse._moved_attributes = _urllib_parse_moved_attributes

_importer._add_module(Module_six_moves_urllib_parse(__name__ + ".moves.urllib_parse"),
                      "moves.urllib_parse", "moves.urllib.parse")

class Module_six_moves_urllib_error(_LazyModule):

    """Lazy loading of moved objects in six.moves.urllib_error"""


_urllib_error_moved_attributes = [
    MovedAttribute("URLError", "urllib2", "urllib.error"),
    MovedAttribute("HTTPError", "urllib2", "urllib.error"),
    MovedAttribute("ContentTooShortError", "urllib", "urllib.error"),
]
for attr in _urllib_error_moved_attributes:
    setattr(Module_six_moves_urllib_error, attr.name, attr)
del attr

Module_six_moves_urllib_error._moved_attributes = _urllib_error_moved_attributes

_importer._add_module(Module_six_moves_urllib_error(__name__ + ".moves.urllib.error"),
                      "moves.urllib_error", "moves.urllib.error")

class Module_six_moves_urllib_request(_LazyModule):

    """Lazy loading of moved objects in six.moves.urllib_request"""


_urllib_request_moved_attributes = [
    MovedAttribute("urlopen", "urllib2", "urllib.request"),
    MovedAttribute("install_opener", "urllib2", "urllib.request"),
    MovedAttribute("build_opener", "urllib2", "urllib.request"),
    MovedAttribute("pathname2url", "urllib", "urllib.request"),
    MovedAttribute("url2pathname", "urllib", "urllib.request"),
    MovedAttribute("getproxies", "urllib", "urllib.request"),
    MovedAttribute("Request", "urllib2", "urllib.request"),
    MovedAttribute("OpenerDirector", "urllib2", "urllib.request"),
    MovedAttribute("HTTPDefaultErrorHandler", "urllib2", "urllib.request"),
    MovedAttribute("HTTPRedirectHandler", "urllib2", "urllib.request"),
    MovedAttribute("HTTPCookieProcessor", "urllib2", "urllib.request"),
    MovedAttribute("ProxyHandler", "urllib2", "urllib.request"),
    MovedAttribute("BaseHandler", "urllib2", "urllib.request"),
    MovedAttribute("HTTPPasswordMgr", "urllib2", "urllib.request"),
    MovedAttribute("HTTPPasswordMgrWithDefaultRealm", "urllib2", "urllib.request"),
    MovedAttribute("AbstractBasicAuthHandler", "urllib2", "urllib.request"),
    MovedAttribute("HTTPBasicAuthHandler", "urllib2", "urllib.request"),
    MovedAttribute("ProxyBasicAuthHandler", "urllib2", "urllib.request"),
    MovedAttribute("AbstractDigestAuthHandler", "urllib2", "urllib.request"),
    MovedAttribute("HTTPDigestAuthHandler", "urllib2", "urllib.request"),
    MovedAttribute("ProxyDigestAuthHandler", "urllib2", "urllib.request"),
    MovedAttribute("HTTPHandler", "urllib2", "urllib.request"),
    MovedAttribute("HTTPSHandler", "urllib2", "urllib.request"),
    MovedAttribute("FileHandler", "urllib2", "urllib.request"),
    MovedAttribute("FTPHandler", "urllib2", "urllib.request"),
    MovedAttribute("CacheFTPHandler", "urllib2", "urllib.request"),
    MovedAttribute("UnknownHandler", "urllib2", "urllib.request"),
    MovedAttribute("HTTPErrorProcessor", "urllib2", "urllib.request"),
    MovedAttribute("urlretrieve", "urllib", "urllib.request"),
    MovedAttribute("urlcleanup", "urllib", "urllib.request"),
    MovedAttribute("URLopener", "urllib", "urllib.request"),
    MovedAttribute("FancyURLopener", "urllib", "urllib.request"),
    MovedAttribute("proxy_bypass", "urllib", "urllib.request"),
]
for attr in _urllib_request_moved_attributes:
    setattr(Module_six_moves_urllib_request, attr.name, attr)
del attr

Module_six_moves_urllib_request._moved_attributes = _urllib_request_moved_attributes

_importer._add_module(Module_six_moves_urllib_request(__name__ + ".moves.urllib.request"),
                      "moves.urllib_request", "moves.urllib.request")

class Module_six_moves_urllib_response(_LazyModule):

    """Lazy loading of moved objects in six.moves.urllib_response"""


_urllib_response_moved_attributes = [
    MovedAttribute("addbase", "urllib", "urllib.response"),
    MovedAttribute("addclosehook", "urllib", "urllib.response"),
    MovedAttribute("addinfo", "urllib", "urllib.response"),
    MovedAttribute("addinfourl", "urllib", "urllib.response"),
]
for attr in _urllib_response_moved_attributes:
    setattr(Module_six_moves_urllib_response, attr.name, attr)
del attr

Module_six_moves_urllib_response._moved_attributes = _urllib_response_moved_attributes

_importer._add_module(Module_six_moves_urllib_response(__name__ + ".moves.urllib.response"),
                      "moves.urllib_response", "moves.urllib.response")

class Module_six_moves_urllib_robotparser(_LazyModule):

    """Lazy loading of moved objects in six.moves.urllib_robotparser"""


_urllib_robotparser_moved_attributes = [
    MovedAttribute("RobotFileParser", "robotparser", "urllib.robotparser"),
]
for attr in _urllib_robotparser_moved_attributes:
    setattr(Module_six_moves_urllib_robotparser, attr.name, attr)
del attr

Module_six_moves_urllib_robotparser._moved_attributes = _urllib_robotparser_moved_attributes

_importer._add_module(Module_six_moves_urllib_robotparser(__name__ + ".moves.urllib.robotparser"),
                      "moves.urllib_robotparser", "moves.urllib.robotparser")

class Module_six_moves_urllib(types.ModuleType):

    """Create a six.moves.urllib namespace that resembles the Python 3 namespace"""
    __path__ = []  # mark as package
    parse = _importer._get_module("moves.urllib_parse")
    error = _importer._get_module("moves.urllib_error")
    request = _importer._get_module("moves.urllib_request")
    response = _importer._get_module("moves.urllib_response")
    robotparser = _importer._get_module("moves.urllib_robotparser")

    def __dir__(self):
        return ['parse', 'error', 'request', 'response', 'robotparser']

_importer._add_module(Module_six_moves_urllib(__name__ + ".moves.urllib"),
                      "moves.urllib")

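The lazy-module machinery above defers every import until the attribute is first touched. A minimal stdlib-only sketch of the same idea (not part of `six`; `LazyNamespace` and its `_targets` map are illustrative names) uses a `ModuleType` subclass whose `__getattr__` resolves and caches real modules on demand:

```python
import types
import importlib


class LazyNamespace(types.ModuleType):
    """Resolve attribute names to real modules on first access."""

    # Illustrative mapping: attribute name -> real module path.
    _targets = {"parse": "urllib.parse", "error": "urllib.error"}

    def __getattr__(self, name):
        try:
            target = self._targets[name]
        except KeyError:
            raise AttributeError(name)
        mod = importlib.import_module(target)
        # Cache on the instance so __getattr__ only runs once per name.
        setattr(self, name, mod)
        return mod


ns = LazyNamespace("lazy_urllib")
print(ns.parse.quote("a b"))  # urllib.parse imported only now -> a%20b
```

Because `__getattr__` is only consulted on a miss, the `setattr` cache means subsequent lookups are ordinary attribute access with no import overhead.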
def add_move(move):
    """Add an item to six.moves."""
    setattr(_MovedItems, move.name, move)


def remove_move(name):
    """Remove item from six.moves."""
    try:
        delattr(_MovedItems, name)
    except AttributeError:
        try:
            del moves.__dict__[name]
        except KeyError:
            raise AttributeError("no such move, %r" % (name,))

if PY3:
    _meth_func = "__func__"
    _meth_self = "__self__"

    _func_closure = "__closure__"
    _func_code = "__code__"
    _func_defaults = "__defaults__"
    _func_globals = "__globals__"
else:
    _meth_func = "im_func"
    _meth_self = "im_self"

    _func_closure = "func_closure"
    _func_code = "func_code"
    _func_defaults = "func_defaults"
    _func_globals = "func_globals"

try:
    advance_iterator = next
except NameError:
    def advance_iterator(it):
        return it.next()
next = advance_iterator


try:
    callable = callable
except NameError:
    def callable(obj):
        return any("__call__" in klass.__dict__ for klass in type(obj).__mro__)

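The `callable` fallback above targets interpreters (Python 3.0/3.1) that lacked the builtin. A small self-contained sketch of the same MRO-scan check (standalone names, not from `six`):

```python
def callable_fallback(obj):
    # An object is callable iff some class in its MRO defines __call__.
    return any("__call__" in klass.__dict__ for klass in type(obj).__mro__)


class WithCall:
    def __call__(self):
        return 42


print(callable_fallback(WithCall()))  # True
print(callable_fallback(3))           # False
```

Scanning `type(obj).__mro__` rather than the instance matters: special methods are looked up on the type, so an instance attribute named `__call__` would not make the object callable.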
if PY3:
    def get_unbound_function(unbound):
        return unbound

    create_bound_method = types.MethodType

    def create_unbound_method(func, cls):
        return func

    Iterator = object
else:
    def get_unbound_function(unbound):
        return unbound.im_func

    def create_bound_method(func, obj):
        return types.MethodType(func, obj, obj.__class__)

    def create_unbound_method(func, cls):
        return types.MethodType(func, None, cls)

    class Iterator(object):

        def next(self):
            return type(self).__next__(self)

    callable = callable
_add_doc(get_unbound_function,
         """Get the function out of a possibly unbound function""")


get_method_function = operator.attrgetter(_meth_func)
get_method_self = operator.attrgetter(_meth_self)
get_function_closure = operator.attrgetter(_func_closure)
get_function_code = operator.attrgetter(_func_code)
get_function_defaults = operator.attrgetter(_func_defaults)
get_function_globals = operator.attrgetter(_func_globals)

if PY3:
    def iterkeys(d, **kw):
        return iter(d.keys(**kw))

    def itervalues(d, **kw):
        return iter(d.values(**kw))

    def iteritems(d, **kw):
        return iter(d.items(**kw))

    def iterlists(d, **kw):
        return iter(d.lists(**kw))

    viewkeys = operator.methodcaller("keys")

    viewvalues = operator.methodcaller("values")

    viewitems = operator.methodcaller("items")
else:
    def iterkeys(d, **kw):
        return d.iterkeys(**kw)

    def itervalues(d, **kw):
        return d.itervalues(**kw)

    def iteritems(d, **kw):
        return d.iteritems(**kw)

    def iterlists(d, **kw):
        return d.iterlists(**kw)

    viewkeys = operator.methodcaller("viewkeys")

    viewvalues = operator.methodcaller("viewvalues")

    viewitems = operator.methodcaller("viewitems")

_add_doc(iterkeys, "Return an iterator over the keys of a dictionary.")
_add_doc(itervalues, "Return an iterator over the values of a dictionary.")
_add_doc(iteritems,
         "Return an iterator over the (key, value) pairs of a dictionary.")
_add_doc(iterlists,
         "Return an iterator over the (key, [values]) pairs of a dictionary.")

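On Python 3 the dictionary helpers above are thin wrappers over the dict view methods. A quick standalone check of that equivalence (the `_py3` names are illustrative, not from `six`):

```python
import operator


def iteritems_py3(d, **kw):
    # Same shape as the PY3 branch above: iterate the items view.
    return iter(d.items(**kw))


viewitems_py3 = operator.methodcaller("items")

d = {"a": 1, "b": 2}
print(sorted(iteritems_py3(d)))  # [('a', 1), ('b', 2)]
print(set(viewitems_py3(d)) == d.items())  # True
```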
if PY3:
    def b(s):
        return s.encode("latin-1")

    def u(s):
        return s
    unichr = chr
    import struct
    int2byte = struct.Struct(">B").pack
    del struct
    byte2int = operator.itemgetter(0)
    indexbytes = operator.getitem
    iterbytes = iter
    import io
    StringIO = io.StringIO
    BytesIO = io.BytesIO
    _assertCountEqual = "assertCountEqual"
    if sys.version_info[1] <= 1:
        _assertRaisesRegex = "assertRaisesRegexp"
        _assertRegex = "assertRegexpMatches"
    else:
        _assertRaisesRegex = "assertRaisesRegex"
        _assertRegex = "assertRegex"
else:
    def b(s):
        return s
    # Workaround for standalone backslash

    def u(s):
        return unicode(s.replace(r'\\', r'\\\\'), "unicode_escape")
    unichr = unichr
    int2byte = chr

    def byte2int(bs):
        return ord(bs[0])

    def indexbytes(buf, i):
        return ord(buf[i])
    iterbytes = functools.partial(itertools.imap, ord)
    import StringIO
    StringIO = BytesIO = StringIO.StringIO
    _assertCountEqual = "assertItemsEqual"
    _assertRaisesRegex = "assertRaisesRegexp"
    _assertRegex = "assertRegexpMatches"
_add_doc(b, """Byte literal""")
_add_doc(u, """Text literal""")

def assertCountEqual(self, *args, **kwargs):
    return getattr(self, _assertCountEqual)(*args, **kwargs)


def assertRaisesRegex(self, *args, **kwargs):
    return getattr(self, _assertRaisesRegex)(*args, **kwargs)


def assertRegex(self, *args, **kwargs):
    return getattr(self, _assertRegex)(*args, **kwargs)

if PY3:
    exec_ = getattr(moves.builtins, "exec")

    def reraise(tp, value, tb=None):
        if value is None:
            value = tp()
        if value.__traceback__ is not tb:
            raise value.with_traceback(tb)
        raise value

else:
    def exec_(_code_, _globs_=None, _locs_=None):
        """Execute code in a namespace."""
        if _globs_ is None:
            frame = sys._getframe(1)
            _globs_ = frame.f_globals
            if _locs_ is None:
                _locs_ = frame.f_locals
            del frame
        elif _locs_ is None:
            _locs_ = _globs_
        exec("""exec _code_ in _globs_, _locs_""")

    exec_("""def reraise(tp, value, tb=None):
    raise tp, value, tb
""")

if sys.version_info[:2] == (3, 2):
    exec_("""def raise_from(value, from_value):
    if from_value is None:
        raise value
    raise value from from_value
""")
elif sys.version_info[:2] > (3, 2):
    exec_("""def raise_from(value, from_value):
    raise value from from_value
""")
else:
    def raise_from(value, from_value):
        raise value

print_ = getattr(moves.builtins, "print", None)
if print_ is None:
    def print_(*args, **kwargs):
        """The new-style print function for Python 2.4 and 2.5."""
        fp = kwargs.pop("file", sys.stdout)
        if fp is None:
            return

        def write(data):
            if not isinstance(data, basestring):
                data = str(data)
            # If the file has an encoding, encode unicode with it.
            if (isinstance(fp, file) and
                    isinstance(data, unicode) and
                    fp.encoding is not None):
                errors = getattr(fp, "errors", None)
                if errors is None:
                    errors = "strict"
                data = data.encode(fp.encoding, errors)
            fp.write(data)
        want_unicode = False
        sep = kwargs.pop("sep", None)
        if sep is not None:
            if isinstance(sep, unicode):
                want_unicode = True
            elif not isinstance(sep, str):
                raise TypeError("sep must be None or a string")
        end = kwargs.pop("end", None)
        if end is not None:
            if isinstance(end, unicode):
                want_unicode = True
            elif not isinstance(end, str):
                raise TypeError("end must be None or a string")
        if kwargs:
            raise TypeError("invalid keyword arguments to print()")
        if not want_unicode:
            for arg in args:
                if isinstance(arg, unicode):
                    want_unicode = True
                    break
        if want_unicode:
            newline = unicode("\n")
            space = unicode(" ")
        else:
            newline = "\n"
            space = " "
        if sep is None:
            sep = space
        if end is None:
            end = newline
        for i, arg in enumerate(args):
            if i:
                write(sep)
            write(arg)
        write(end)
if sys.version_info[:2] < (3, 3):
    _print = print_

    def print_(*args, **kwargs):
        fp = kwargs.get("file", sys.stdout)
        flush = kwargs.pop("flush", False)
        _print(*args, **kwargs)
        if flush and fp is not None:
            fp.flush()

_add_doc(reraise, """Reraise an exception.""")

if sys.version_info[0:2] < (3, 4):
    def wraps(wrapped, assigned=functools.WRAPPER_ASSIGNMENTS,
              updated=functools.WRAPPER_UPDATES):
        def wrapper(f):
            f = functools.wraps(wrapped, assigned, updated)(f)
            f.__wrapped__ = wrapped
            return f
        return wrapper
else:
    wraps = functools.wraps

def with_metaclass(meta, *bases):
    """Create a base class with a metaclass."""
    # This requires a bit of explanation: the basic idea is to make a dummy
    # metaclass for one level of class instantiation that replaces itself with
    # the actual metaclass.
    class metaclass(meta):

        def __new__(cls, name, this_bases, d):
            return meta(name, bases, d)
    return type.__new__(metaclass, 'temporary_class', (), {})

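The `temporary_class` trick can be seen in action with a small standalone sketch (`Meta`, `Base`, and `Child` are illustrative names): the dummy metaclass intercepts one level of class creation and hands the real bases to the real metaclass, so `Meta` runs exactly once, on the final class.

```python
def with_metaclass(meta, *bases):
    # Simplified copy of the function above, kept here so the sketch
    # is self-contained.
    class metaclass(meta):
        def __new__(cls, name, this_bases, d):
            return meta(name, bases, d)
    return type.__new__(metaclass, 'temporary_class', (), {})


class Meta(type):
    def __new__(mcs, name, bases, d):
        d.setdefault("tag", "stamped-by-meta")
        return super().__new__(mcs, name, bases, d)


class Base:
    pass


class Child(with_metaclass(Meta, Base)):
    pass


print(type(Child) is Meta, Child.tag)  # True stamped-by-meta
```

The same spelling works under Python 2 because the metaclass is carried by the temporary base class rather than by syntax (`metaclass=` vs. `__metaclass__`).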
def add_metaclass(metaclass):
    """Class decorator for creating a class with a metaclass."""
    def wrapper(cls):
        orig_vars = cls.__dict__.copy()
        slots = orig_vars.get('__slots__')
        if slots is not None:
            if isinstance(slots, str):
                slots = [slots]
            for slots_var in slots:
                orig_vars.pop(slots_var)
        orig_vars.pop('__dict__', None)
        orig_vars.pop('__weakref__', None)
        return metaclass(cls.__name__, cls.__bases__, orig_vars)
    return wrapper

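The decorator form rebuilds the already-created class through the metaclass, stripping `__slots__` members, `__dict__`, and `__weakref__` so the rebuilt class carries no stale descriptors. A standalone usage sketch (`Registry` and `Plugin` are illustrative names, not from `six`):

```python
def add_metaclass(metaclass):
    # Simplified copy of the decorator above so the sketch is self-contained.
    def wrapper(cls):
        orig_vars = cls.__dict__.copy()
        slots = orig_vars.get('__slots__')
        if slots is not None:
            if isinstance(slots, str):
                slots = [slots]
            for slots_var in slots:
                orig_vars.pop(slots_var)
        orig_vars.pop('__dict__', None)
        orig_vars.pop('__weakref__', None)
        return metaclass(cls.__name__, cls.__bases__, orig_vars)
    return wrapper


class Registry(type):
    classes = []

    def __new__(mcs, name, bases, d):
        cls = super().__new__(mcs, name, bases, d)
        mcs.classes.append(name)
        return cls


@add_metaclass(Registry)
class Plugin(object):
    pass


print(type(Plugin) is Registry, Registry.classes)  # True ['Plugin']
```

Note the registry records `Plugin` only once: the undecorated class is built by plain `type`, and only the decorator's rebuild goes through `Registry`.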
def python_2_unicode_compatible(klass):
    """
    A decorator that defines __unicode__ and __str__ methods under Python 2.
    Under Python 3 it does nothing.

    To support Python 2 and 3 with a single code base, define a __str__ method
    returning text and apply this decorator to the class.
    """
    if PY2:
        if '__str__' not in klass.__dict__:
            raise ValueError("@python_2_unicode_compatible cannot be applied "
                             "to %s because it doesn't define __str__()." %
                             klass.__name__)
        klass.__unicode__ = klass.__str__
        klass.__str__ = lambda self: self.__unicode__().encode('utf-8')
    return klass

# Complete the moves implementation.
# This code is at the end of this module to speed up module loading.
# Turn this module into a package.
__path__ = []  # required for PEP 302 and PEP 451
__package__ = __name__  # see PEP 366 @ReservedAssignment
if globals().get("__spec__") is not None:
    __spec__.submodule_search_locations = []  # PEP 451 @UndefinedVariable
# Remove other six meta path importers, since they cause problems. This can
# happen if six is removed from sys.modules and then reloaded. (Setuptools does
# this for some reason.)
if sys.meta_path:
    for i, importer in enumerate(sys.meta_path):
        # Here's some real nastiness: Another "instance" of the six module might
        # be floating around. Therefore, we can't use isinstance() to check for
        # the six meta path importer, since the other six instance will have
        # inserted an importer with different class.
        if (type(importer).__name__ == "_SixMetaPathImporter" and
                importer.name == __name__):
            del sys.meta_path[i]
            break
    del i, importer
# Finally, add the importer to the meta path import hook.
sys.meta_path.append(_importer)
73
Fusion Accounting/i18n/odoo-18-community.venv/Lib/site-packages/pkg_resources/extern/__init__.py
vendored
Normal file
@@ -0,0 +1,73 @@
import sys


class VendorImporter:
    """
    A PEP 302 meta path importer for finding optionally-vendored
    or otherwise naturally-installed packages from root_name.
    """

    def __init__(self, root_name, vendored_names=(), vendor_pkg=None):
        self.root_name = root_name
        self.vendored_names = set(vendored_names)
        self.vendor_pkg = vendor_pkg or root_name.replace('extern', '_vendor')

    @property
    def search_path(self):
        """
        Search first the vendor package then as a natural package.
        """
        yield self.vendor_pkg + '.'
        yield ''

    def find_module(self, fullname, path=None):
        """
        Return self when fullname starts with root_name and the
        target module is one vendored through this importer.
        """
        root, base, target = fullname.partition(self.root_name + '.')
        if root:
            return
        if not any(map(target.startswith, self.vendored_names)):
            return
        return self

    def load_module(self, fullname):
        """
        Iterate over the search path to locate and load fullname.
        """
        root, base, target = fullname.partition(self.root_name + '.')
        for prefix in self.search_path:
            try:
                extant = prefix + target
                __import__(extant)
                mod = sys.modules[extant]
                sys.modules[fullname] = mod
                # mysterious hack:
                # Remove the reference to the extant package/module
                # on later Python versions to cause relative imports
                # in the vendor package to resolve the same modules
                # as those going through this importer.
                if sys.version_info > (3, 3):
                    del sys.modules[extant]
                return mod
            except ImportError:
                pass
        else:
            raise ImportError(
                "The '{target}' package is required; "
                "normally this is bundled with this package so if you get "
                "this warning, consult the packager of your "
                "distribution.".format(**locals())
            )

    def install(self):
        """
        Install this importer into sys.meta_path if not already present.
        """
        if self not in sys.meta_path:
            sys.meta_path.append(self)


names = 'packaging', 'pyparsing', 'six', 'appdirs'
VendorImporter(__name__, names).install()
@@ -0,0 +1,22 @@
import os
import errno
import sys


def _makedirs_31(path, exist_ok=False):
    try:
        os.makedirs(path)
    except OSError as exc:
        if not exist_ok or exc.errno != errno.EEXIST:
            raise


# rely on compatibility behavior until mode considerations
# and exists_ok considerations are disentangled.
# See https://github.com/pypa/setuptools/pull/1083#issuecomment-315168663
needs_makedirs = (
    sys.version_info < (3, 2, 5) or
    (3, 3) <= sys.version_info < (3, 3, 6) or
    (3, 4) <= sys.version_info < (3, 4, 1)
)
makedirs = _makedirs_31 if needs_makedirs else os.makedirs
@@ -0,0 +1,36 @@
.. image:: https://img.shields.io/pypi/v/setuptools.svg
   :target: https://pypi.org/project/setuptools

.. image:: https://readthedocs.org/projects/setuptools/badge/?version=latest
   :target: https://setuptools.readthedocs.io

.. image:: https://img.shields.io/travis/pypa/setuptools/master.svg?label=Linux%20build%20%40%20Travis%20CI
   :target: https://travis-ci.org/pypa/setuptools

.. image:: https://img.shields.io/appveyor/ci/jaraco/setuptools/master.svg?label=Windows%20build%20%40%20Appveyor
   :target: https://ci.appveyor.com/project/jaraco/setuptools/branch/master

.. image:: https://img.shields.io/pypi/pyversions/setuptools.svg

See the `Installation Instructions
<https://packaging.python.org/installing/>`_ in the Python Packaging
User's Guide for instructions on installing, upgrading, and uninstalling
Setuptools.

The project is `maintained at GitHub <https://github.com/pypa/setuptools>`_.

Questions and comments should be directed to the `distutils-sig
mailing list <http://mail.python.org/pipermail/distutils-sig/>`_.
Bug reports and especially tested patches may be
submitted directly to the `bug tracker
<https://github.com/pypa/setuptools/issues>`_.


Code of Conduct
---------------

Everyone interacting in the setuptools project's codebases, issue trackers,
chat rooms, and mailing lists is expected to follow the
`PyPA Code of Conduct <https://www.pypa.io/en/latest/code-of-conduct/>`_.
@@ -0,0 +1 @@
pip
@@ -0,0 +1,19 @@
Copyright (C) 2016 Jason R Coombs <jaraco@jaraco.com>

Permission is hereby granted, free of charge, to any person obtaining a copy of
this software and associated documentation files (the "Software"), to deal in
the Software without restriction, including without limitation the rights to
use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies
of the Software, and to permit persons to whom the Software is furnished to do
so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
@@ -0,0 +1,71 @@
Metadata-Version: 2.0
Name: setuptools
Version: 39.0.1
Summary: Easily download, build, install, upgrade, and uninstall Python packages
Home-page: https://github.com/pypa/setuptools
Author: Python Packaging Authority
Author-email: distutils-sig@python.org
License: UNKNOWN
Project-URL: Documentation, https://setuptools.readthedocs.io/
Keywords: CPAN PyPI distutils eggs package management
Platform: UNKNOWN
Classifier: Development Status :: 5 - Production/Stable
Classifier: Intended Audience :: Developers
Classifier: License :: OSI Approved :: MIT License
Classifier: Operating System :: OS Independent
Classifier: Programming Language :: Python :: 2
Classifier: Programming Language :: Python :: 2.7
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.3
Classifier: Programming Language :: Python :: 3.4
Classifier: Programming Language :: Python :: 3.5
Classifier: Programming Language :: Python :: 3.6
Classifier: Topic :: Software Development :: Libraries :: Python Modules
Classifier: Topic :: System :: Archiving :: Packaging
Classifier: Topic :: System :: Systems Administration
Classifier: Topic :: Utilities
Requires-Python: >=2.7,!=3.0.*,!=3.1.*,!=3.2.*
Description-Content-Type: text/x-rst; charset=UTF-8
Provides-Extra: certs
Provides-Extra: ssl
Provides-Extra: certs
Requires-Dist: certifi (==2016.9.26); extra == 'certs'
Provides-Extra: ssl
Requires-Dist: wincertstore (==0.2); sys_platform=='win32' and extra == 'ssl'

.. image:: https://img.shields.io/pypi/v/setuptools.svg
   :target: https://pypi.org/project/setuptools

.. image:: https://readthedocs.org/projects/setuptools/badge/?version=latest
   :target: https://setuptools.readthedocs.io

.. image:: https://img.shields.io/travis/pypa/setuptools/master.svg?label=Linux%20build%20%40%20Travis%20CI
   :target: https://travis-ci.org/pypa/setuptools

.. image:: https://img.shields.io/appveyor/ci/jaraco/setuptools/master.svg?label=Windows%20build%20%40%20Appveyor
   :target: https://ci.appveyor.com/project/jaraco/setuptools/branch/master

.. image:: https://img.shields.io/pypi/pyversions/setuptools.svg

See the `Installation Instructions
<https://packaging.python.org/installing/>`_ in the Python Packaging
User's Guide for instructions on installing, upgrading, and uninstalling
Setuptools.

The project is `maintained at GitHub <https://github.com/pypa/setuptools>`_.

Questions and comments should be directed to the `distutils-sig
mailing list <http://mail.python.org/pipermail/distutils-sig/>`_.
Bug reports and especially tested patches may be
submitted directly to the `bug tracker
<https://github.com/pypa/setuptools/issues>`_.


Code of Conduct
---------------

Everyone interacting in the setuptools project's codebases, issue trackers,
chat rooms, and mailing lists is expected to follow the
`PyPA Code of Conduct <https://www.pypa.io/en/latest/code-of-conduct/>`_.
@@ -0,0 +1,188 @@
easy_install.py,sha256=MDC9vt5AxDsXX5qcKlBz2TnW6Tpuv_AobnfhCJ9X3PM,126
pkg_resources/__init__.py,sha256=YQ4_WQnPztMsUy1yuvp7ZRBPK9IhOyhgosLpvkFso1I,103551
pkg_resources/py31compat.py,sha256=-ysVqoxLetAnL94uM0kHkomKQTC1JZLN2ZUjqUhMeKE,600
pkg_resources/_vendor/__init__.py,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
pkg_resources/_vendor/appdirs.py,sha256=tgGaL0m4Jo2VeuGfoOOifLv7a7oUEJu2n1vRkqoPw-0,22374
pkg_resources/_vendor/pyparsing.py,sha256=PifeLY3-WhIcBVzLtv0U4T_pwDtPruBhBCkg5vLqa28,229867
pkg_resources/_vendor/six.py,sha256=A6hdJZVjI3t_geebZ9BzUvwRrIXo0lfwzQlM2LcKyas,30098
pkg_resources/_vendor/packaging/__about__.py,sha256=zkcCPTN_6TcLW0Nrlg0176-R1QQ_WVPTm8sz1R4-HjM,720
pkg_resources/_vendor/packaging/__init__.py,sha256=_vNac5TrzwsrzbOFIbF-5cHqc_Y2aPT2D7zrIR06BOo,513
pkg_resources/_vendor/packaging/_compat.py,sha256=Vi_A0rAQeHbU-a9X0tt1yQm9RqkgQbDSxzRw8WlU9kA,860
pkg_resources/_vendor/packaging/_structures.py,sha256=RImECJ4c_wTlaTYYwZYLHEiebDMaAJmK1oPARhw1T5o,1416
pkg_resources/_vendor/packaging/markers.py,sha256=uEcBBtGvzqltgnArqb9c4RrcInXezDLos14zbBHhWJo,8248
pkg_resources/_vendor/packaging/requirements.py,sha256=SikL2UynbsT0qtY9ltqngndha_sfo0w6XGFhAhoSoaQ,4355
pkg_resources/_vendor/packaging/specifiers.py,sha256=SAMRerzO3fK2IkFZCaZkuwZaL_EGqHNOz4pni4vhnN0,28025
pkg_resources/_vendor/packaging/utils.py,sha256=3m6WvPm6NNxE8rkTGmn0r75B_GZSGg7ikafxHsBN1WA,421
pkg_resources/_vendor/packaging/version.py,sha256=OwGnxYfr2ghNzYx59qWIBkrK3SnB6n-Zfd1XaLpnnM0,11556
pkg_resources/extern/__init__.py,sha256=JUtlHHvlxHSNuB4pWqNjcx7n6kG-fwXg7qmJ2zNJlIY,2487
setuptools/__init__.py,sha256=WWIdCbFJnZ9fZoaWDN_x1vDA_Rkm-Sc15iKvPtIYKFs,5700
setuptools/archive_util.py,sha256=kw8Ib_lKjCcnPKNbS7h8HztRVK0d5RacU3r_KRdVnmM,6592
setuptools/build_meta.py,sha256=FllaKTr1vSJyiUeRjVJEZmeEaRzhYueNlimtcwaJba8,5671
setuptools/cli-32.exe,sha256=dfEuovMNnA2HLa3jRfMPVi5tk4R7alCbpTvuxtCyw0Y,65536
setuptools/cli-64.exe,sha256=KLABu5pyrnokJCv6skjXZ6GsXeyYHGcqOUT3oHI3Xpo,74752
setuptools/cli.exe,sha256=dfEuovMNnA2HLa3jRfMPVi5tk4R7alCbpTvuxtCyw0Y,65536
setuptools/config.py,sha256=tVYBM3w1U_uBRRTOZydflxyZ_IrTJT5odlZz3cbuhSw,16381
setuptools/dep_util.py,sha256=fgixvC1R7sH3r13ktyf7N0FALoqEXL1cBarmNpSEoWg,935
setuptools/depends.py,sha256=hC8QIDcM3VDpRXvRVA6OfL9AaQfxvhxHcN_w6sAyNq8,5837
setuptools/dist.py,sha256=_wCSFiGqwyaOUTj0tBjqZF2bqW9aEVu4W1D4gmsveno,42514
setuptools/extension.py,sha256=uc6nHI-MxwmNCNPbUiBnybSyqhpJqjbhvOQ-emdvt_E,1729
setuptools/glibc.py,sha256=X64VvGPL2AbURKwYRsWJOXXGAYOiF_v2qixeTkAULuU,3146
setuptools/glob.py,sha256=Y-fpv8wdHZzv9DPCaGACpMSBWJ6amq_1e0R_i8_el4w,5207
setuptools/gui-32.exe,sha256=XBr0bHMA6Hpz2s9s9Bzjl-PwXfa9nH4ie0rFn4V2kWA,65536
setuptools/gui-64.exe,sha256=aYKMhX1IJLn4ULHgWX0sE0yREUt6B3TEHf_jOw6yNyE,75264
setuptools/gui.exe,sha256=XBr0bHMA6Hpz2s9s9Bzjl-PwXfa9nH4ie0rFn4V2kWA,65536
setuptools/launch.py,sha256=sd7ejwhBocCDx_wG9rIs0OaZ8HtmmFU8ZC6IR_S0Lvg,787
setuptools/lib2to3_ex.py,sha256=t5e12hbR2pi9V4ezWDTB4JM-AISUnGOkmcnYHek3xjg,2013
setuptools/monkey.py,sha256=zZGTH7p0xeXQKLmEwJTPIE4m5m7fJeHoAsxyv5M8e_E,5789
setuptools/msvc.py,sha256=8EiV9ypb3EQJQssPcH1HZbdNsbRvqsFnJ7wPFEGwFIo,40877
setuptools/namespaces.py,sha256=F0Nrbv8KCT2OrO7rwa03om4N4GZKAlnce-rr-cgDQa8,3199
setuptools/package_index.py,sha256=NEsrNXnt_9gGP-nCCYzV-0gk15lXAGO7RghRxpfqLqE,40142
setuptools/pep425tags.py,sha256=NuGMx1gGif7x6iYemh0LfgBr_FZF5GFORIbgmMdU8J4,10882
setuptools/py27compat.py,sha256=3mwxRMDk5Q5O1rSXOERbQDXhFqwDJhhUitfMW_qpUCo,536
setuptools/py31compat.py,sha256=XuU1HCsGE_3zGvBRIhYw2iB-IhCFK4-Pxw_jMiqdNVk,1192
setuptools/py33compat.py,sha256=NKS84nl4LjLIoad6OQfgmygZn4mMvrok_b1N1tzebew,1182
setuptools/py36compat.py,sha256=VUDWxmu5rt4QHlGTRtAFu6W5jvfL6WBjeDAzeoBy0OM,2891
setuptools/sandbox.py,sha256=9UbwfEL5QY436oMI1LtFWohhoZ-UzwHvGyZjUH_qhkw,14276
setuptools/script (dev).tmpl,sha256=f7MR17dTkzaqkCMSVseyOCMVrPVSMdmTQsaB8cZzfuI,201
setuptools/script.tmpl,sha256=WGTt5piezO27c-Dbx6l5Q4T3Ff20A5z7872hv3aAhYY,138
setuptools/site-patch.py,sha256=BVt6yIrDMXJoflA5J6DJIcsJUfW_XEeVhOzelTTFDP4,2307
setuptools/ssl_support.py,sha256=YBDJsCZjSp62CWjxmSkke9kn9rhHHj25Cus6zhJRW3c,8492
setuptools/unicode_utils.py,sha256=NOiZ_5hD72A6w-4wVj8awHFM3n51Kmw1Ic_vx15XFqw,996
setuptools/version.py,sha256=og_cuZQb0QI6ukKZFfZWPlr1HgJBPPn2vO2m_bI9ZTE,144
setuptools/wheel.py,sha256=yF9usxMvpwnymV-oOo5mfDiv3E8jrKkbDEItT7_kjBs,7230
setuptools/windows_support.py,sha256=5GrfqSP2-dLGJoZTq2g6dCKkyQxxa2n5IQiXlJCoYEE,714
setuptools/_vendor/__init__.py,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
setuptools/_vendor/pyparsing.py,sha256=PifeLY3-WhIcBVzLtv0U4T_pwDtPruBhBCkg5vLqa28,229867
setuptools/_vendor/six.py,sha256=A6hdJZVjI3t_geebZ9BzUvwRrIXo0lfwzQlM2LcKyas,30098
setuptools/_vendor/packaging/__about__.py,sha256=zkcCPTN_6TcLW0Nrlg0176-R1QQ_WVPTm8sz1R4-HjM,720
setuptools/_vendor/packaging/__init__.py,sha256=_vNac5TrzwsrzbOFIbF-5cHqc_Y2aPT2D7zrIR06BOo,513
setuptools/_vendor/packaging/_compat.py,sha256=Vi_A0rAQeHbU-a9X0tt1yQm9RqkgQbDSxzRw8WlU9kA,860
setuptools/_vendor/packaging/_structures.py,sha256=RImECJ4c_wTlaTYYwZYLHEiebDMaAJmK1oPARhw1T5o,1416
setuptools/_vendor/packaging/markers.py,sha256=Gvpk9EY20yKaMTiKgQZ8yFEEpodqVgVYtfekoic1Yts,8239
setuptools/_vendor/packaging/requirements.py,sha256=t44M2HVWtr8phIz2OhnILzuGT3rTATaovctV1dpnVIg,4343
setuptools/_vendor/packaging/specifiers.py,sha256=SAMRerzO3fK2IkFZCaZkuwZaL_EGqHNOz4pni4vhnN0,28025
setuptools/_vendor/packaging/utils.py,sha256=3m6WvPm6NNxE8rkTGmn0r75B_GZSGg7ikafxHsBN1WA,421
setuptools/_vendor/packaging/version.py,sha256=OwGnxYfr2ghNzYx59qWIBkrK3SnB6n-Zfd1XaLpnnM0,11556
setuptools/command/__init__.py,sha256=NWzJ0A1BEengZpVeqUyWLNm2bk4P3F4iL5QUErHy7kA,594
setuptools/command/alias.py,sha256=KjpE0sz_SDIHv3fpZcIQK-sCkJz-SrC6Gmug6b9Nkc8,2426
setuptools/command/bdist_egg.py,sha256=RQ9h8BmSVpXKJQST3i_b_sm093Z-aCXbfMBEM2IrI-Q,18185
setuptools/command/bdist_rpm.py,sha256=B7l0TnzCGb-0nLlm6rS00jWLkojASwVmdhW2w5Qz_Ak,1508
setuptools/command/bdist_wininst.py,sha256=_6dz3lpB1tY200LxKPLM7qgwTCceOMgaWFF-jW2-pm0,637
setuptools/command/build_clib.py,sha256=bQ9aBr-5ZSO-9fGsGsDLz0mnnFteHUZnftVLkhvHDq0,4484
setuptools/command/build_ext.py,sha256=PCRAZ2xYnqyEof7EFNtpKYl0sZzT0qdKUNTH3sUdPqk,13173
setuptools/command/build_py.py,sha256=yWyYaaS9F3o9JbIczn064A5g1C5_UiKRDxGaTqYbtLE,9596
setuptools/command/develop.py,sha256=wKbOw2_qUvcDti2lZmtxbDmYb54yAAibExzXIvToz-A,8046
setuptools/command/dist_info.py,sha256=5t6kOfrdgALT-P3ogss6PF9k-Leyesueycuk3dUyZnI,960
setuptools/command/easy_install.py,sha256=I0UOqFrS9U7fmh0uW57IR37keMKSeqXp6z61Oz1nEoA,87054
setuptools/command/egg_info.py,sha256=3b5Y3t_bl_zZRCkmlGi3igvRze9oOaxd-dVf2w1FBOc,24800
setuptools/command/install.py,sha256=a0EZpL_A866KEdhicTGbuyD_TYl1sykfzdrri-zazT4,4683
setuptools/command/install_egg_info.py,sha256=bMgeIeRiXzQ4DAGPV1328kcjwQjHjOWU4FngAWLV78Q,2203
setuptools/command/install_lib.py,sha256=11mxf0Ch12NsuYwS8PHwXBRvyh671QAM4cTRh7epzG0,3840
setuptools/command/install_scripts.py,sha256=UD0rEZ6861mTYhIdzcsqKnUl8PozocXWl9VBQ1VTWnc,2439
setuptools/command/launcher manifest.xml,sha256=xlLbjWrB01tKC0-hlVkOKkiSPbzMml2eOPtJ_ucCnbE,628
setuptools/command/py36compat.py,sha256=SzjZcOxF7zdFUT47Zv2n7AM3H8koDys_0OpS-n9gIfc,4986
setuptools/command/register.py,sha256=bHlMm1qmBbSdahTOT8w6UhA-EgeQIz7p6cD-qOauaiI,270
setuptools/command/rotate.py,sha256=co5C1EkI7P0GGT6Tqz-T2SIj2LBJTZXYELpmao6d4KQ,2164
setuptools/command/saveopts.py,sha256=za7QCBcQimKKriWcoCcbhxPjUz30gSB74zuTL47xpP4,658
setuptools/command/sdist.py,sha256=obDTe2BmWt2PlnFPZZh7e0LWvemEsbCCO9MzhrTZjm8,6711
setuptools/command/setopt.py,sha256=NTWDyx-gjDF-txf4dO577s7LOzHVoKR0Mq33rFxaRr8,5085
setuptools/command/test.py,sha256=MeBAcXUePGjPKqjz4zvTrHatLvNsjlPFcagt3XnFYdk,9214
setuptools/command/upload.py,sha256=i1gfItZ3nQOn5FKXb8tLC2Kd7eKC8lWO4bdE6NqGpE4,1172
setuptools/command/upload_docs.py,sha256=oXiGplM_cUKLwE4CWWw98RzCufAu8tBhMC97GegFcms,7311
setuptools/extern/__init__.py,sha256=2eKMsBMwsZqolIcYBtLZU3t96s6xSTP4PTaNfM5P-I0,2499
setuptools-39.0.1.dist-info/DESCRIPTION.rst,sha256=It3a3GRjT5701mqhrpMcLyW_YS2Dokv-X8zWoTaMRe0,1422
setuptools-39.0.1.dist-info/LICENSE.txt,sha256=wyo6w5WvYyHv0ovnPQagDw22q4h9HCHU_sRhKNIFbVo,1078
setuptools-39.0.1.dist-info/METADATA,sha256=bUSvsq3nbwr4FDQmI4Cu1Sd17lRO4y4MFANuLmZ70gs,2903
setuptools-39.0.1.dist-info/RECORD,,
setuptools-39.0.1.dist-info/WHEEL,sha256=kdsN-5OJAZIiHN-iO4Rhl82KyS0bDWf4uBwMbkNafr8,110
setuptools-39.0.1.dist-info/dependency_links.txt,sha256=HlkCFkoK5TbZ5EMLbLKYhLcY_E31kBWD8TqW2EgmatQ,239
setuptools-39.0.1.dist-info/entry_points.txt,sha256=jBqCYDlVjl__sjYFGXo1JQGIMAYFJE-prYWUtnMZEew,2990
setuptools-39.0.1.dist-info/metadata.json,sha256=kJuHY3HestbJAAqqkLVW75x2Uxgxd2qaz4sQAfFCtXM,4969
setuptools-39.0.1.dist-info/top_level.txt,sha256=2HUXVVwA4Pff1xgTFr3GsTXXKaPaO6vlG6oNJ_4u4Tg,38
setuptools-39.0.1.dist-info/zip-safe,sha256=AbpHGcgLb-kRsJGnwFEktk7uzpZOCcBY74-YBdrKVGs,1
../../Scripts/easy_install.exe,sha256=H1pR71NDkAyfC5a4zTB-XWb2ht0_C6jsWA2_QAYXFiU,102879
../../Scripts/easy_install-3.7.exe,sha256=H1pR71NDkAyfC5a4zTB-XWb2ht0_C6jsWA2_QAYXFiU,102879
setuptools-39.0.1.dist-info/INSTALLER,sha256=zuuue4knoyJ-UwPPXg8fezS7VCrXJQrAP7zeNuwvFQg,4
pkg_resources/extern/__pycache__/__init__.cpython-37.pyc,,
pkg_resources/_vendor/packaging/__pycache__/markers.cpython-37.pyc,,
pkg_resources/_vendor/packaging/__pycache__/requirements.cpython-37.pyc,,
pkg_resources/_vendor/packaging/__pycache__/specifiers.cpython-37.pyc,,
pkg_resources/_vendor/packaging/__pycache__/utils.cpython-37.pyc,,
pkg_resources/_vendor/packaging/__pycache__/version.cpython-37.pyc,,
pkg_resources/_vendor/packaging/__pycache__/_compat.cpython-37.pyc,,
pkg_resources/_vendor/packaging/__pycache__/_structures.cpython-37.pyc,,
pkg_resources/_vendor/packaging/__pycache__/__about__.cpython-37.pyc,,
pkg_resources/_vendor/packaging/__pycache__/__init__.cpython-37.pyc,,
pkg_resources/_vendor/__pycache__/appdirs.cpython-37.pyc,,
pkg_resources/_vendor/__pycache__/pyparsing.cpython-37.pyc,,
pkg_resources/_vendor/__pycache__/six.cpython-37.pyc,,
pkg_resources/_vendor/__pycache__/__init__.cpython-37.pyc,,
pkg_resources/__pycache__/py31compat.cpython-37.pyc,,
pkg_resources/__pycache__/__init__.cpython-37.pyc,,
setuptools/command/__pycache__/alias.cpython-37.pyc,,
setuptools/command/__pycache__/bdist_egg.cpython-37.pyc,,
setuptools/command/__pycache__/bdist_rpm.cpython-37.pyc,,
setuptools/command/__pycache__/bdist_wininst.cpython-37.pyc,,
setuptools/command/__pycache__/build_clib.cpython-37.pyc,,
setuptools/command/__pycache__/build_ext.cpython-37.pyc,,
setuptools/command/__pycache__/build_py.cpython-37.pyc,,
setuptools/command/__pycache__/develop.cpython-37.pyc,,
setuptools/command/__pycache__/dist_info.cpython-37.pyc,,
setuptools/command/__pycache__/easy_install.cpython-37.pyc,,
setuptools/command/__pycache__/egg_info.cpython-37.pyc,,
setuptools/command/__pycache__/install.cpython-37.pyc,,
setuptools/command/__pycache__/install_egg_info.cpython-37.pyc,,
setuptools/command/__pycache__/install_lib.cpython-37.pyc,,
setuptools/command/__pycache__/install_scripts.cpython-37.pyc,,
setuptools/command/__pycache__/py36compat.cpython-37.pyc,,
setuptools/command/__pycache__/register.cpython-37.pyc,,
setuptools/command/__pycache__/rotate.cpython-37.pyc,,
setuptools/command/__pycache__/saveopts.cpython-37.pyc,,
setuptools/command/__pycache__/sdist.cpython-37.pyc,,
setuptools/command/__pycache__/setopt.cpython-37.pyc,,
setuptools/command/__pycache__/test.cpython-37.pyc,,
setuptools/command/__pycache__/upload.cpython-37.pyc,,
setuptools/command/__pycache__/upload_docs.cpython-37.pyc,,
setuptools/command/__pycache__/__init__.cpython-37.pyc,,
setuptools/extern/__pycache__/__init__.cpython-37.pyc,,
setuptools/_vendor/packaging/__pycache__/markers.cpython-37.pyc,,
setuptools/_vendor/packaging/__pycache__/requirements.cpython-37.pyc,,
setuptools/_vendor/packaging/__pycache__/specifiers.cpython-37.pyc,,
setuptools/_vendor/packaging/__pycache__/utils.cpython-37.pyc,,
setuptools/_vendor/packaging/__pycache__/version.cpython-37.pyc,,
setuptools/_vendor/packaging/__pycache__/_compat.cpython-37.pyc,,
setuptools/_vendor/packaging/__pycache__/_structures.cpython-37.pyc,,
setuptools/_vendor/packaging/__pycache__/__about__.cpython-37.pyc,,
setuptools/_vendor/packaging/__pycache__/__init__.cpython-37.pyc,,
setuptools/_vendor/__pycache__/pyparsing.cpython-37.pyc,,
setuptools/_vendor/__pycache__/six.cpython-37.pyc,,
setuptools/_vendor/__pycache__/__init__.cpython-37.pyc,,
setuptools/__pycache__/archive_util.cpython-37.pyc,,
setuptools/__pycache__/build_meta.cpython-37.pyc,,
setuptools/__pycache__/config.cpython-37.pyc,,
setuptools/__pycache__/depends.cpython-37.pyc,,
setuptools/__pycache__/dep_util.cpython-37.pyc,,
setuptools/__pycache__/dist.cpython-37.pyc,,
setuptools/__pycache__/extension.cpython-37.pyc,,
setuptools/__pycache__/glibc.cpython-37.pyc,,
setuptools/__pycache__/glob.cpython-37.pyc,,
setuptools/__pycache__/launch.cpython-37.pyc,,
setuptools/__pycache__/lib2to3_ex.cpython-37.pyc,,
setuptools/__pycache__/monkey.cpython-37.pyc,,
setuptools/__pycache__/msvc.cpython-37.pyc,,
setuptools/__pycache__/namespaces.cpython-37.pyc,,
setuptools/__pycache__/package_index.cpython-37.pyc,,
setuptools/__pycache__/pep425tags.cpython-37.pyc,,
setuptools/__pycache__/py27compat.cpython-37.pyc,,
setuptools/__pycache__/py31compat.cpython-37.pyc,,
setuptools/__pycache__/py33compat.cpython-37.pyc,,
setuptools/__pycache__/py36compat.cpython-37.pyc,,
setuptools/__pycache__/sandbox.cpython-37.pyc,,
setuptools/__pycache__/site-patch.cpython-37.pyc,,
setuptools/__pycache__/ssl_support.cpython-37.pyc,,
setuptools/__pycache__/unicode_utils.cpython-37.pyc,,
setuptools/__pycache__/version.cpython-37.pyc,,
setuptools/__pycache__/wheel.cpython-37.pyc,,
setuptools/__pycache__/windows_support.cpython-37.pyc,,
setuptools/__pycache__/__init__.cpython-37.pyc,,
__pycache__/easy_install.cpython-37.pyc,,
@@ -0,0 +1,6 @@
Wheel-Version: 1.0
Generator: bdist_wheel (0.30.0)
Root-Is-Purelib: true
Tag: py2-none-any
Tag: py3-none-any

@@ -0,0 +1,2 @@
https://files.pythonhosted.org/packages/source/c/certifi/certifi-2016.9.26.tar.gz#md5=baa81e951a29958563689d868ef1064d
https://files.pythonhosted.org/packages/source/w/wincertstore/wincertstore-0.2.zip#md5=ae728f2f007185648d0c7a8679b361e2
@@ -0,0 +1,65 @@
[console_scripts]
easy_install = setuptools.command.easy_install:main
easy_install-3.6 = setuptools.command.easy_install:main

[distutils.commands]
alias = setuptools.command.alias:alias
bdist_egg = setuptools.command.bdist_egg:bdist_egg
bdist_rpm = setuptools.command.bdist_rpm:bdist_rpm
bdist_wininst = setuptools.command.bdist_wininst:bdist_wininst
build_clib = setuptools.command.build_clib:build_clib
build_ext = setuptools.command.build_ext:build_ext
build_py = setuptools.command.build_py:build_py
develop = setuptools.command.develop:develop
dist_info = setuptools.command.dist_info:dist_info
easy_install = setuptools.command.easy_install:easy_install
egg_info = setuptools.command.egg_info:egg_info
install = setuptools.command.install:install
install_egg_info = setuptools.command.install_egg_info:install_egg_info
install_lib = setuptools.command.install_lib:install_lib
install_scripts = setuptools.command.install_scripts:install_scripts
register = setuptools.command.register:register
rotate = setuptools.command.rotate:rotate
saveopts = setuptools.command.saveopts:saveopts
sdist = setuptools.command.sdist:sdist
setopt = setuptools.command.setopt:setopt
test = setuptools.command.test:test
upload = setuptools.command.upload:upload
upload_docs = setuptools.command.upload_docs:upload_docs

[distutils.setup_keywords]
convert_2to3_doctests = setuptools.dist:assert_string_list
dependency_links = setuptools.dist:assert_string_list
eager_resources = setuptools.dist:assert_string_list
entry_points = setuptools.dist:check_entry_points
exclude_package_data = setuptools.dist:check_package_data
extras_require = setuptools.dist:check_extras
include_package_data = setuptools.dist:assert_bool
install_requires = setuptools.dist:check_requirements
namespace_packages = setuptools.dist:check_nsp
package_data = setuptools.dist:check_package_data
packages = setuptools.dist:check_packages
python_requires = setuptools.dist:check_specifier
setup_requires = setuptools.dist:check_requirements
test_loader = setuptools.dist:check_importable
test_runner = setuptools.dist:check_importable
test_suite = setuptools.dist:check_test_suite
tests_require = setuptools.dist:check_requirements
use_2to3 = setuptools.dist:assert_bool
use_2to3_exclude_fixers = setuptools.dist:assert_string_list
use_2to3_fixers = setuptools.dist:assert_string_list
zip_safe = setuptools.dist:assert_bool

[egg_info.writers]
PKG-INFO = setuptools.command.egg_info:write_pkg_info
dependency_links.txt = setuptools.command.egg_info:overwrite_arg
depends.txt = setuptools.command.egg_info:warn_depends_obsolete
eager_resources.txt = setuptools.command.egg_info:overwrite_arg
entry_points.txt = setuptools.command.egg_info:write_entries
namespace_packages.txt = setuptools.command.egg_info:overwrite_arg
requires.txt = setuptools.command.egg_info:write_requirements
top_level.txt = setuptools.command.egg_info:write_toplevel_names

[setuptools.installation]
eggsecutable = setuptools.command.easy_install:bootstrap

@@ -0,0 +1 @@
{"classifiers": ["Development Status :: 5 - Production/Stable", "Intended Audience :: Developers", "License :: OSI Approved :: MIT License", "Operating System :: OS Independent", "Programming Language :: Python :: 2", "Programming Language :: Python :: 2.7", "Programming Language :: Python :: 3", "Programming Language :: Python :: 3.3", "Programming Language :: Python :: 3.4", "Programming Language :: Python :: 3.5", "Programming Language :: Python :: 3.6", "Topic :: Software Development :: Libraries :: Python Modules", "Topic :: System :: Archiving :: Packaging", "Topic :: System :: Systems Administration", "Topic :: Utilities"], "description_content_type": "text/x-rst; charset=UTF-8", "extensions": {"python.commands": {"wrap_console": {"easy_install": "setuptools.command.easy_install:main", "easy_install-3.6": "setuptools.command.easy_install:main"}}, "python.details": {"contacts": [{"email": "distutils-sig@python.org", "name": "Python Packaging Authority", "role": "author"}], "document_names": {"description": "DESCRIPTION.rst", "license": "LICENSE.txt"}, "project_urls": {"Home": "https://github.com/pypa/setuptools"}}, "python.exports": {"console_scripts": {"easy_install": "setuptools.command.easy_install:main", "easy_install-3.6": "setuptools.command.easy_install:main"}, "distutils.commands": {"alias": "setuptools.command.alias:alias", "bdist_egg": "setuptools.command.bdist_egg:bdist_egg", "bdist_rpm": "setuptools.command.bdist_rpm:bdist_rpm", "bdist_wininst": "setuptools.command.bdist_wininst:bdist_wininst", "build_clib": "setuptools.command.build_clib:build_clib", "build_ext": "setuptools.command.build_ext:build_ext", "build_py": "setuptools.command.build_py:build_py", "develop": "setuptools.command.develop:develop", "dist_info": "setuptools.command.dist_info:dist_info", "easy_install": "setuptools.command.easy_install:easy_install", "egg_info": "setuptools.command.egg_info:egg_info", "install": "setuptools.command.install:install", "install_egg_info": "setuptools.command.install_egg_info:install_egg_info", "install_lib": "setuptools.command.install_lib:install_lib", "install_scripts": "setuptools.command.install_scripts:install_scripts", "register": "setuptools.command.register:register", "rotate": "setuptools.command.rotate:rotate", "saveopts": "setuptools.command.saveopts:saveopts", "sdist": "setuptools.command.sdist:sdist", "setopt": "setuptools.command.setopt:setopt", "test": "setuptools.command.test:test", "upload": "setuptools.command.upload:upload", "upload_docs": "setuptools.command.upload_docs:upload_docs"}, "distutils.setup_keywords": {"convert_2to3_doctests": "setuptools.dist:assert_string_list", "dependency_links": "setuptools.dist:assert_string_list", "eager_resources": "setuptools.dist:assert_string_list", "entry_points": "setuptools.dist:check_entry_points", "exclude_package_data": "setuptools.dist:check_package_data", "extras_require": "setuptools.dist:check_extras", "include_package_data": "setuptools.dist:assert_bool", "install_requires": "setuptools.dist:check_requirements", "namespace_packages": "setuptools.dist:check_nsp", "package_data": "setuptools.dist:check_package_data", "packages": "setuptools.dist:check_packages", "python_requires": "setuptools.dist:check_specifier", "setup_requires": "setuptools.dist:check_requirements", "test_loader": "setuptools.dist:check_importable", "test_runner": "setuptools.dist:check_importable", "test_suite": "setuptools.dist:check_test_suite", "tests_require": "setuptools.dist:check_requirements", "use_2to3": "setuptools.dist:assert_bool", "use_2to3_exclude_fixers": "setuptools.dist:assert_string_list", "use_2to3_fixers": "setuptools.dist:assert_string_list", "zip_safe": "setuptools.dist:assert_bool"}, "egg_info.writers": {"PKG-INFO": "setuptools.command.egg_info:write_pkg_info", "dependency_links.txt": "setuptools.command.egg_info:overwrite_arg", "depends.txt": "setuptools.command.egg_info:warn_depends_obsolete", "eager_resources.txt": "setuptools.command.egg_info:overwrite_arg", "entry_points.txt": "setuptools.command.egg_info:write_entries", "namespace_packages.txt": "setuptools.command.egg_info:overwrite_arg", "requires.txt": "setuptools.command.egg_info:write_requirements", "top_level.txt": "setuptools.command.egg_info:write_toplevel_names"}, "setuptools.installation": {"eggsecutable": "setuptools.command.easy_install:bootstrap"}}}, "extras": ["certs", "ssl"], "generator": "bdist_wheel (0.30.0)", "keywords": ["CPAN", "PyPI", "distutils", "eggs", "package", "management"], "metadata_version": "2.0", "name": "setuptools", "project_url": "Documentation, https://setuptools.readthedocs.io/", "requires_python": ">=2.7,!=3.0.*,!=3.1.*,!=3.2.*", "run_requires": [{"extra": "certs", "requires": ["certifi (==2016.9.26)"]}, {"environment": "sys_platform=='win32'", "extra": "ssl", "requires": ["wincertstore (==0.2)"]}], "summary": "Easily download, build, install, upgrade, and uninstall Python packages", "version": "39.0.1"}
@@ -0,0 +1,3 @@
easy_install
pkg_resources
setuptools
@@ -0,0 +1 @@

@@ -0,0 +1,180 @@
|
|||||||
|
"""Extensions to the 'distutils' for large or complex distributions"""
|
||||||
|
|
||||||
|
import os
|
||||||
|
import functools
|
||||||
|
import distutils.core
|
||||||
|
import distutils.filelist
|
||||||
|
from distutils.util import convert_path
|
||||||
|
from fnmatch import fnmatchcase
|
||||||
|
|
||||||
|
from setuptools.extern.six.moves import filter, map
|
||||||
|
|
||||||
|
import setuptools.version
|
||||||
|
from setuptools.extension import Extension
|
||||||
|
from setuptools.dist import Distribution, Feature
|
||||||
|
from setuptools.depends import Require
|
||||||
|
from . import monkey
|
||||||
|
|
||||||
|
__all__ = [
|
||||||
|
'setup', 'Distribution', 'Feature', 'Command', 'Extension', 'Require',
|
||||||
|
'find_packages',
|
||||||
|
]
|
||||||
|
|
||||||
|
__version__ = setuptools.version.__version__
|
||||||
|
|
||||||
|
bootstrap_install_from = None
|
||||||
|
|
||||||
|
# If we run 2to3 on .py files, should we also convert docstrings?
|
||||||
|
# Default: yes; assume that we can detect doctests reliably
|
||||||
|
run_2to3_on_doctests = True
|
||||||
|
# Standard package names for fixer packages
|
||||||
|
lib2to3_fixer_packages = ['lib2to3.fixes']
|
||||||
|
|
||||||
|
|
||||||
|
class PackageFinder(object):
|
||||||
|
"""
|
||||||
|
Generate a list of all Python packages found within a directory
|
||||||
|
"""
|
||||||
|
|
||||||
|
@classmethod
|
||||||
|
def find(cls, where='.', exclude=(), include=('*',)):
|
||||||
|
"""Return a list all Python packages found within directory 'where'
|
||||||
|
|
||||||
|
'where' is the root directory which will be searched for packages. It
|
||||||
|
should be supplied as a "cross-platform" (i.e. URL-style) path; it will
|
||||||
|
be converted to the appropriate local path syntax.
|
||||||
|
|
||||||
|
'exclude' is a sequence of package names to exclude; '*' can be used
|
||||||
|
as a wildcard in the names, such that 'foo.*' will exclude all
|
||||||
|
subpackages of 'foo' (but not 'foo' itself).
|
||||||
|
|
||||||
|
'include' is a sequence of package names to include. If it's
|
||||||
|
specified, only the named packages will be included. If it's not
|
||||||
|
specified, all found packages will be included. 'include' can contain
|
||||||
|
shell style wildcard patterns just like 'exclude'.
|
||||||
|
"""
|
||||||
|
|
||||||
|
return list(cls._find_packages_iter(
|
||||||
|
convert_path(where),
|
||||||
|
cls._build_filter('ez_setup', '*__pycache__', *exclude),
|
||||||
|
cls._build_filter(*include)))
|
||||||
|
|
||||||
|
@classmethod
|
||||||
|
def _find_packages_iter(cls, where, exclude, include):
|
||||||
|
"""
|
||||||
|
All the packages found in 'where' that pass the 'include' filter, but
|
||||||
|
not the 'exclude' filter.
|
||||||
|
"""
|
||||||
|
for root, dirs, files in os.walk(where, followlinks=True):
|
||||||
|
# Copy dirs to iterate over it, then empty dirs.
|
||||||
|
all_dirs = dirs[:]
|
||||||
|
dirs[:] = []
|
||||||
|
|
||||||
|
for dir in all_dirs:
|
||||||
|
full_path = os.path.join(root, dir)
|
||||||
|
                rel_path = os.path.relpath(full_path, where)
                package = rel_path.replace(os.path.sep, '.')

                # Skip directory trees that are not valid packages
                if ('.' in dir or not cls._looks_like_package(full_path)):
                    continue

                # Should this package be included?
                if include(package) and not exclude(package):
                    yield package

                # Keep searching subdirectories, as there may be more packages
                # down there, even if the parent was excluded.
                dirs.append(dir)

    @staticmethod
    def _looks_like_package(path):
        """Does a directory look like a package?"""
        return os.path.isfile(os.path.join(path, '__init__.py'))

    @staticmethod
    def _build_filter(*patterns):
        """
        Given a list of patterns, return a callable that will be true only if
        the input matches at least one of the patterns.
        """
        return lambda name: any(fnmatchcase(name, pat=pat) for pat in patterns)


class PEP420PackageFinder(PackageFinder):
    @staticmethod
    def _looks_like_package(path):
        return True


find_packages = PackageFinder.find


def _install_setup_requires(attrs):
    # Note: do not use `setuptools.Distribution` directly, as
    # our PEP 517 backend patch `distutils.core.Distribution`.
    dist = distutils.core.Distribution(dict(
        (k, v) for k, v in attrs.items()
        if k in ('dependency_links', 'setup_requires')
    ))
    # Honor setup.cfg's options.
    dist.parse_config_files(ignore_option_errors=True)
    if dist.setup_requires:
        dist.fetch_build_eggs(dist.setup_requires)


def setup(**attrs):
    # Make sure we have any requirements needed to interpret 'attrs'.
    _install_setup_requires(attrs)
    return distutils.core.setup(**attrs)

setup.__doc__ = distutils.core.setup.__doc__


_Command = monkey.get_unpatched(distutils.core.Command)


class Command(_Command):
    __doc__ = _Command.__doc__

    command_consumes_arguments = False

    def __init__(self, dist, **kw):
        """
        Construct the command for dist, updating
        vars(self) with any keyword parameters.
        """
        _Command.__init__(self, dist)
        vars(self).update(kw)

    def reinitialize_command(self, command, reinit_subcommands=0, **kw):
        cmd = _Command.reinitialize_command(self, command, reinit_subcommands)
        vars(cmd).update(kw)
        return cmd


def _find_all_simple(path):
    """
    Find all files under 'path'
    """
    results = (
        os.path.join(base, file)
        for base, dirs, files in os.walk(path, followlinks=True)
        for file in files
    )
    return filter(os.path.isfile, results)


def findall(dir=os.curdir):
    """
    Find all files under 'dir' and return the list of full filenames.
    Unless dir is '.', return full filenames with dir prepended.
    """
    files = _find_all_simple(dir)
    if dir == os.curdir:
        make_rel = functools.partial(os.path.relpath, start=dir)
        files = map(make_rel, files)
    return list(files)


monkey.patch_all()
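The `find_packages` alias exported above is the public entry point most `setup.py` files call. As a quick sketch of its behavior (using a hypothetical throwaway directory layout, not part of the vendored code): directories with an `__init__.py` become dotted package names; directories without one are skipped; `exclude` patterns are matched with `fnmatchcase`.

```python
import os
import tempfile

from setuptools import find_packages

# Throwaway project layout: pkg/ and pkg/sub/ carry __init__.py and are
# packages; data/ has no __init__.py, so PackageFinder skips it.
root = tempfile.mkdtemp()
for d in ("pkg", os.path.join("pkg", "sub"), "data"):
    os.makedirs(os.path.join(root, d))
for p in ("pkg", os.path.join("pkg", "sub")):
    open(os.path.join(root, p, "__init__.py"), "w").close()

found = find_packages(root)
pruned = find_packages(root, exclude=["pkg.sub"])
```

Here `found` contains both `pkg` and `pkg.sub`, while `pruned` drops the subpackage because the exclude glob matched it.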
@@ -0,0 +1,21 @@
# This file is dual licensed under the terms of the Apache License, Version
# 2.0, and the BSD License. See the LICENSE file in the root of this repository
# for complete details.
from __future__ import absolute_import, division, print_function

__all__ = [
    "__title__", "__summary__", "__uri__", "__version__", "__author__",
    "__email__", "__license__", "__copyright__",
]

__title__ = "packaging"
__summary__ = "Core utilities for Python packages"
__uri__ = "https://github.com/pypa/packaging"

__version__ = "16.8"

__author__ = "Donald Stufft and individual contributors"
__email__ = "donald@stufft.io"

__license__ = "BSD or Apache License, Version 2.0"
__copyright__ = "Copyright 2014-2016 %s" % __author__
@@ -0,0 +1,14 @@
# This file is dual licensed under the terms of the Apache License, Version
# 2.0, and the BSD License. See the LICENSE file in the root of this repository
# for complete details.
from __future__ import absolute_import, division, print_function

from .__about__ import (
    __author__, __copyright__, __email__, __license__, __summary__, __title__,
    __uri__, __version__
)

__all__ = [
    "__title__", "__summary__", "__uri__", "__version__", "__author__",
    "__email__", "__license__", "__copyright__",
]
@@ -0,0 +1,30 @@
# This file is dual licensed under the terms of the Apache License, Version
# 2.0, and the BSD License. See the LICENSE file in the root of this repository
# for complete details.
from __future__ import absolute_import, division, print_function

import sys


PY2 = sys.version_info[0] == 2
PY3 = sys.version_info[0] == 3

# flake8: noqa

if PY3:
    string_types = str,
else:
    string_types = basestring,


def with_metaclass(meta, *bases):
    """
    Create a base class with a metaclass.
    """
    # This requires a bit of explanation: the basic idea is to make a dummy
    # metaclass for one level of class instantiation that replaces itself with
    # the actual metaclass.
    class metaclass(meta):
        def __new__(cls, name, this_bases, d):
            return meta(name, bases, d)
    return type.__new__(metaclass, 'temporary_class', (), {})
@@ -0,0 +1,68 @@
# This file is dual licensed under the terms of the Apache License, Version
# 2.0, and the BSD License. See the LICENSE file in the root of this repository
# for complete details.
from __future__ import absolute_import, division, print_function


class Infinity(object):

    def __repr__(self):
        return "Infinity"

    def __hash__(self):
        return hash(repr(self))

    def __lt__(self, other):
        return False

    def __le__(self, other):
        return False

    def __eq__(self, other):
        return isinstance(other, self.__class__)

    def __ne__(self, other):
        return not isinstance(other, self.__class__)

    def __gt__(self, other):
        return True

    def __ge__(self, other):
        return True

    def __neg__(self):
        return NegativeInfinity

Infinity = Infinity()


class NegativeInfinity(object):

    def __repr__(self):
        return "-Infinity"

    def __hash__(self):
        return hash(repr(self))

    def __lt__(self, other):
        return True

    def __le__(self, other):
        return True

    def __eq__(self, other):
        return isinstance(other, self.__class__)

    def __ne__(self, other):
        return not isinstance(other, self.__class__)

    def __gt__(self, other):
        return False

    def __ge__(self, other):
        return False

    def __neg__(self):
        return Infinity

NegativeInfinity = NegativeInfinity()
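The two sentinels above let version-comparison code use values that sort above and below every real version component. A minimal sketch of how they behave (importing the private `packaging._structures` module directly, purely for illustration):

```python
from packaging._structures import Infinity, NegativeInfinity

# Infinity compares greater than any other value; NegativeInfinity, lower.
above = Infinity > (9999, 9999)
below = NegativeInfinity < (0,)

# Negating one sentinel yields the other.
flipped = (-Infinity) == NegativeInfinity
```

This is why they are useful as padding when comparing version tuples of unequal length.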
@@ -0,0 +1,301 @@
# This file is dual licensed under the terms of the Apache License, Version
# 2.0, and the BSD License. See the LICENSE file in the root of this repository
# for complete details.
from __future__ import absolute_import, division, print_function

import operator
import os
import platform
import sys

from setuptools.extern.pyparsing import ParseException, ParseResults, stringStart, stringEnd
from setuptools.extern.pyparsing import ZeroOrMore, Group, Forward, QuotedString
from setuptools.extern.pyparsing import Literal as L  # noqa

from ._compat import string_types
from .specifiers import Specifier, InvalidSpecifier


__all__ = [
    "InvalidMarker", "UndefinedComparison", "UndefinedEnvironmentName",
    "Marker", "default_environment",
]


class InvalidMarker(ValueError):
    """
    An invalid marker was found, users should refer to PEP 508.
    """


class UndefinedComparison(ValueError):
    """
    An invalid operation was attempted on a value that doesn't support it.
    """


class UndefinedEnvironmentName(ValueError):
    """
    A name was attempted to be used that does not exist inside of the
    environment.
    """


class Node(object):

    def __init__(self, value):
        self.value = value

    def __str__(self):
        return str(self.value)

    def __repr__(self):
        return "<{0}({1!r})>".format(self.__class__.__name__, str(self))

    def serialize(self):
        raise NotImplementedError


class Variable(Node):

    def serialize(self):
        return str(self)


class Value(Node):

    def serialize(self):
        return '"{0}"'.format(self)


class Op(Node):

    def serialize(self):
        return str(self)


VARIABLE = (
    L("implementation_version") |
    L("platform_python_implementation") |
    L("implementation_name") |
    L("python_full_version") |
    L("platform_release") |
    L("platform_version") |
    L("platform_machine") |
    L("platform_system") |
    L("python_version") |
    L("sys_platform") |
    L("os_name") |
    L("os.name") |  # PEP-345
    L("sys.platform") |  # PEP-345
    L("platform.version") |  # PEP-345
    L("platform.machine") |  # PEP-345
    L("platform.python_implementation") |  # PEP-345
    L("python_implementation") |  # undocumented setuptools legacy
    L("extra")
)
ALIASES = {
    'os.name': 'os_name',
    'sys.platform': 'sys_platform',
    'platform.version': 'platform_version',
    'platform.machine': 'platform_machine',
    'platform.python_implementation': 'platform_python_implementation',
    'python_implementation': 'platform_python_implementation'
}
VARIABLE.setParseAction(lambda s, l, t: Variable(ALIASES.get(t[0], t[0])))

VERSION_CMP = (
    L("===") |
    L("==") |
    L(">=") |
    L("<=") |
    L("!=") |
    L("~=") |
    L(">") |
    L("<")
)

MARKER_OP = VERSION_CMP | L("not in") | L("in")
MARKER_OP.setParseAction(lambda s, l, t: Op(t[0]))

MARKER_VALUE = QuotedString("'") | QuotedString('"')
MARKER_VALUE.setParseAction(lambda s, l, t: Value(t[0]))

BOOLOP = L("and") | L("or")

MARKER_VAR = VARIABLE | MARKER_VALUE

MARKER_ITEM = Group(MARKER_VAR + MARKER_OP + MARKER_VAR)
MARKER_ITEM.setParseAction(lambda s, l, t: tuple(t[0]))

LPAREN = L("(").suppress()
RPAREN = L(")").suppress()

MARKER_EXPR = Forward()
MARKER_ATOM = MARKER_ITEM | Group(LPAREN + MARKER_EXPR + RPAREN)
MARKER_EXPR << MARKER_ATOM + ZeroOrMore(BOOLOP + MARKER_EXPR)

MARKER = stringStart + MARKER_EXPR + stringEnd


def _coerce_parse_result(results):
    if isinstance(results, ParseResults):
        return [_coerce_parse_result(i) for i in results]
    else:
        return results


def _format_marker(marker, first=True):
    assert isinstance(marker, (list, tuple, string_types))

    # Sometimes we have a structure like [[...]] which is a single item list
    # where the single item is itself it's own list. In that case we want skip
    # the rest of this function so that we don't get extraneous () on the
    # outside.
    if (isinstance(marker, list) and len(marker) == 1 and
            isinstance(marker[0], (list, tuple))):
        return _format_marker(marker[0])

    if isinstance(marker, list):
        inner = (_format_marker(m, first=False) for m in marker)
        if first:
            return " ".join(inner)
        else:
            return "(" + " ".join(inner) + ")"
    elif isinstance(marker, tuple):
        return " ".join([m.serialize() for m in marker])
    else:
        return marker


_operators = {
    "in": lambda lhs, rhs: lhs in rhs,
    "not in": lambda lhs, rhs: lhs not in rhs,
    "<": operator.lt,
    "<=": operator.le,
    "==": operator.eq,
    "!=": operator.ne,
    ">=": operator.ge,
    ">": operator.gt,
}


def _eval_op(lhs, op, rhs):
    try:
        spec = Specifier("".join([op.serialize(), rhs]))
    except InvalidSpecifier:
        pass
    else:
        return spec.contains(lhs)

    oper = _operators.get(op.serialize())
    if oper is None:
        raise UndefinedComparison(
            "Undefined {0!r} on {1!r} and {2!r}.".format(op, lhs, rhs)
        )

    return oper(lhs, rhs)


_undefined = object()


def _get_env(environment, name):
    value = environment.get(name, _undefined)

    if value is _undefined:
        raise UndefinedEnvironmentName(
            "{0!r} does not exist in evaluation environment.".format(name)
        )

    return value


def _evaluate_markers(markers, environment):
    groups = [[]]

    for marker in markers:
        assert isinstance(marker, (list, tuple, string_types))

        if isinstance(marker, list):
            groups[-1].append(_evaluate_markers(marker, environment))
        elif isinstance(marker, tuple):
            lhs, op, rhs = marker

            if isinstance(lhs, Variable):
                lhs_value = _get_env(environment, lhs.value)
                rhs_value = rhs.value
            else:
                lhs_value = lhs.value
                rhs_value = _get_env(environment, rhs.value)

            groups[-1].append(_eval_op(lhs_value, op, rhs_value))
        else:
            assert marker in ["and", "or"]
            if marker == "or":
                groups.append([])

    return any(all(item) for item in groups)


def format_full_version(info):
    version = '{0.major}.{0.minor}.{0.micro}'.format(info)
    kind = info.releaselevel
    if kind != 'final':
        version += kind[0] + str(info.serial)
    return version


def default_environment():
    if hasattr(sys, 'implementation'):
        iver = format_full_version(sys.implementation.version)
        implementation_name = sys.implementation.name
    else:
        iver = '0'
        implementation_name = ''

    return {
        "implementation_name": implementation_name,
        "implementation_version": iver,
        "os_name": os.name,
        "platform_machine": platform.machine(),
        "platform_release": platform.release(),
        "platform_system": platform.system(),
        "platform_version": platform.version(),
        "python_full_version": platform.python_version(),
        "platform_python_implementation": platform.python_implementation(),
        "python_version": platform.python_version()[:3],
        "sys_platform": sys.platform,
    }


class Marker(object):

    def __init__(self, marker):
        try:
            self._markers = _coerce_parse_result(MARKER.parseString(marker))
        except ParseException as e:
            err_str = "Invalid marker: {0!r}, parse error at {1!r}".format(
                marker, marker[e.loc:e.loc + 8])
            raise InvalidMarker(err_str)

    def __str__(self):
        return _format_marker(self._markers)

    def __repr__(self):
        return "<Marker({0!r})>".format(str(self))

    def evaluate(self, environment=None):
        """Evaluate a marker.

        Return the boolean from evaluating the given marker against the
        environment. environment is an optional argument to override all or
        part of the determined environment.

        The environment is determined from the current Python process.
        """
        current_environment = default_environment()
        if environment is not None:
            current_environment.update(environment)

        return _evaluate_markers(self._markers, current_environment)
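`Marker.evaluate` accepts an optional environment dict that overrides values taken from the running interpreter, which makes marker evaluation deterministic. A short sketch (using the standalone `packaging` distribution of the same code, and a made-up marker string):

```python
from packaging.markers import Marker

m = Marker('python_version >= "2.7" and os_name == "posix"')

# Override both variables the marker reads, so the result does not depend
# on the interpreter actually running this snippet.
on_posix = m.evaluate({"python_version": "3.6", "os_name": "posix"})
on_nt = m.evaluate({"python_version": "3.6", "os_name": "nt"})
```

Note that `>=` comparisons go through `Specifier.contains` (PEP 440 version ordering), while `==` on a non-version value like `os_name` falls back to plain string equality in `_eval_op`.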
@@ -0,0 +1,127 @@
# This file is dual licensed under the terms of the Apache License, Version
# 2.0, and the BSD License. See the LICENSE file in the root of this repository
# for complete details.
from __future__ import absolute_import, division, print_function

import string
import re

from setuptools.extern.pyparsing import stringStart, stringEnd, originalTextFor, ParseException
from setuptools.extern.pyparsing import ZeroOrMore, Word, Optional, Regex, Combine
from setuptools.extern.pyparsing import Literal as L  # noqa
from setuptools.extern.six.moves.urllib import parse as urlparse

from .markers import MARKER_EXPR, Marker
from .specifiers import LegacySpecifier, Specifier, SpecifierSet


class InvalidRequirement(ValueError):
    """
    An invalid requirement was found, users should refer to PEP 508.
    """


ALPHANUM = Word(string.ascii_letters + string.digits)

LBRACKET = L("[").suppress()
RBRACKET = L("]").suppress()
LPAREN = L("(").suppress()
RPAREN = L(")").suppress()
COMMA = L(",").suppress()
SEMICOLON = L(";").suppress()
AT = L("@").suppress()

PUNCTUATION = Word("-_.")
IDENTIFIER_END = ALPHANUM | (ZeroOrMore(PUNCTUATION) + ALPHANUM)
IDENTIFIER = Combine(ALPHANUM + ZeroOrMore(IDENTIFIER_END))

NAME = IDENTIFIER("name")
EXTRA = IDENTIFIER

URI = Regex(r'[^ ]+')("url")
URL = (AT + URI)

EXTRAS_LIST = EXTRA + ZeroOrMore(COMMA + EXTRA)
EXTRAS = (LBRACKET + Optional(EXTRAS_LIST) + RBRACKET)("extras")

VERSION_PEP440 = Regex(Specifier._regex_str, re.VERBOSE | re.IGNORECASE)
VERSION_LEGACY = Regex(LegacySpecifier._regex_str, re.VERBOSE | re.IGNORECASE)

VERSION_ONE = VERSION_PEP440 ^ VERSION_LEGACY
VERSION_MANY = Combine(VERSION_ONE + ZeroOrMore(COMMA + VERSION_ONE),
                       joinString=",", adjacent=False)("_raw_spec")
_VERSION_SPEC = Optional(((LPAREN + VERSION_MANY + RPAREN) | VERSION_MANY))
_VERSION_SPEC.setParseAction(lambda s, l, t: t._raw_spec or '')

VERSION_SPEC = originalTextFor(_VERSION_SPEC)("specifier")
VERSION_SPEC.setParseAction(lambda s, l, t: t[1])

MARKER_EXPR = originalTextFor(MARKER_EXPR())("marker")
MARKER_EXPR.setParseAction(
    lambda s, l, t: Marker(s[t._original_start:t._original_end])
)
MARKER_SEPERATOR = SEMICOLON
MARKER = MARKER_SEPERATOR + MARKER_EXPR

VERSION_AND_MARKER = VERSION_SPEC + Optional(MARKER)
URL_AND_MARKER = URL + Optional(MARKER)

NAMED_REQUIREMENT = \
    NAME + Optional(EXTRAS) + (URL_AND_MARKER | VERSION_AND_MARKER)

REQUIREMENT = stringStart + NAMED_REQUIREMENT + stringEnd


class Requirement(object):
    """Parse a requirement.

    Parse a given requirement string into its parts, such as name, specifier,
    URL, and extras. Raises InvalidRequirement on a badly-formed requirement
    string.
    """

    # TODO: Can we test whether something is contained within a requirement?
    #       If so how do we do that? Do we need to test against the _name_ of
    #       the thing as well as the version? What about the markers?
    # TODO: Can we normalize the name and extra name?

    def __init__(self, requirement_string):
        try:
            req = REQUIREMENT.parseString(requirement_string)
        except ParseException as e:
            raise InvalidRequirement(
                "Invalid requirement, parse error at \"{0!r}\"".format(
                    requirement_string[e.loc:e.loc + 8]))

        self.name = req.name
        if req.url:
            parsed_url = urlparse.urlparse(req.url)
            if not (parsed_url.scheme and parsed_url.netloc) or (
                    not parsed_url.scheme and not parsed_url.netloc):
                raise InvalidRequirement("Invalid URL given")
            self.url = req.url
        else:
            self.url = None
        self.extras = set(req.extras.asList() if req.extras else [])
        self.specifier = SpecifierSet(req.specifier)
        self.marker = req.marker if req.marker else None

    def __str__(self):
        parts = [self.name]

        if self.extras:
            parts.append("[{0}]".format(",".join(sorted(self.extras))))

        if self.specifier:
            parts.append(str(self.specifier))

        if self.url:
            parts.append("@ {0}".format(self.url))

        if self.marker:
            parts.append("; {0}".format(self.marker))

        return "".join(parts)

    def __repr__(self):
        return "<Requirement({0!r})>".format(str(self))
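`Requirement` splits a PEP 508 string into name, extras, specifier set, and optional marker. A short sketch with a hypothetical requirement string (again via the standalone `packaging` distribution):

```python
from packaging.requirements import Requirement

req = Requirement('sampleproject[cli,test]>=1.0,<2.0; python_version >= "2.7"')

name = req.name                 # the distribution name
extras = sorted(req.extras)     # extras are stored as a set
specifier = str(req.specifier)  # SpecifierSet re-joins clauses sorted, comma-separated
marker = str(req.marker)        # the parsed environment marker
```

Note the round-trip is normalized, not verbatim: `str(req.specifier)` emits the clauses in sorted order (`<2.0,>=1.0`), which is why the audit comparison of copied code should diff parsed structures rather than raw strings.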
@@ -0,0 +1,774 @@
# This file is dual licensed under the terms of the Apache License, Version
# 2.0, and the BSD License. See the LICENSE file in the root of this repository
# for complete details.
from __future__ import absolute_import, division, print_function

import abc
import functools
import itertools
import re

from ._compat import string_types, with_metaclass
from .version import Version, LegacyVersion, parse


class InvalidSpecifier(ValueError):
    """
    An invalid specifier was found, users should refer to PEP 440.
    """


class BaseSpecifier(with_metaclass(abc.ABCMeta, object)):

    @abc.abstractmethod
    def __str__(self):
        """
        Returns the str representation of this Specifier like object. This
        should be representative of the Specifier itself.
        """

    @abc.abstractmethod
    def __hash__(self):
        """
        Returns a hash value for this Specifier like object.
        """

    @abc.abstractmethod
    def __eq__(self, other):
        """
        Returns a boolean representing whether or not the two Specifier like
        objects are equal.
        """

    @abc.abstractmethod
    def __ne__(self, other):
        """
        Returns a boolean representing whether or not the two Specifier like
        objects are not equal.
        """

    @abc.abstractproperty
    def prereleases(self):
        """
        Returns whether or not pre-releases as a whole are allowed by this
        specifier.
        """

    @prereleases.setter
    def prereleases(self, value):
        """
        Sets whether or not pre-releases as a whole are allowed by this
        specifier.
        """

    @abc.abstractmethod
    def contains(self, item, prereleases=None):
        """
        Determines if the given item is contained within this specifier.
        """

    @abc.abstractmethod
    def filter(self, iterable, prereleases=None):
        """
        Takes an iterable of items and filters them so that only items which
        are contained within this specifier are allowed in it.
        """


class _IndividualSpecifier(BaseSpecifier):

    _operators = {}

    def __init__(self, spec="", prereleases=None):
        match = self._regex.search(spec)
        if not match:
            raise InvalidSpecifier("Invalid specifier: '{0}'".format(spec))

        self._spec = (
            match.group("operator").strip(),
            match.group("version").strip(),
        )

        # Store whether or not this Specifier should accept prereleases
        self._prereleases = prereleases

    def __repr__(self):
        pre = (
            ", prereleases={0!r}".format(self.prereleases)
            if self._prereleases is not None
            else ""
        )

        return "<{0}({1!r}{2})>".format(
            self.__class__.__name__,
            str(self),
            pre,
        )

    def __str__(self):
        return "{0}{1}".format(*self._spec)

    def __hash__(self):
        return hash(self._spec)

    def __eq__(self, other):
        if isinstance(other, string_types):
            try:
                other = self.__class__(other)
            except InvalidSpecifier:
                return NotImplemented
        elif not isinstance(other, self.__class__):
            return NotImplemented

        return self._spec == other._spec

    def __ne__(self, other):
        if isinstance(other, string_types):
            try:
                other = self.__class__(other)
            except InvalidSpecifier:
                return NotImplemented
        elif not isinstance(other, self.__class__):
            return NotImplemented

        return self._spec != other._spec

    def _get_operator(self, op):
        return getattr(self, "_compare_{0}".format(self._operators[op]))

    def _coerce_version(self, version):
        if not isinstance(version, (LegacyVersion, Version)):
            version = parse(version)
        return version

    @property
    def operator(self):
        return self._spec[0]

    @property
    def version(self):
        return self._spec[1]

    @property
    def prereleases(self):
        return self._prereleases

    @prereleases.setter
    def prereleases(self, value):
        self._prereleases = value

    def __contains__(self, item):
        return self.contains(item)

    def contains(self, item, prereleases=None):
        # Determine if prereleases are to be allowed or not.
        if prereleases is None:
            prereleases = self.prereleases

        # Normalize item to a Version or LegacyVersion, this allows us to have
        # a shortcut for ``"2.0" in Specifier(">=2")
        item = self._coerce_version(item)

        # Determine if we should be supporting prereleases in this specifier
        # or not, if we do not support prereleases than we can short circuit
        # logic if this version is a prereleases.
        if item.is_prerelease and not prereleases:
            return False

        # Actually do the comparison to determine if this item is contained
        # within this Specifier or not.
        return self._get_operator(self.operator)(item, self.version)

    def filter(self, iterable, prereleases=None):
        yielded = False
        found_prereleases = []

        kw = {"prereleases": prereleases if prereleases is not None else True}

        # Attempt to iterate over all the values in the iterable and if any of
        # them match, yield them.
        for version in iterable:
            parsed_version = self._coerce_version(version)

            if self.contains(parsed_version, **kw):
                # If our version is a prerelease, and we were not set to allow
                # prereleases, then we'll store it for later incase nothing
                # else matches this specifier.
                if (parsed_version.is_prerelease and not
                        (prereleases or self.prereleases)):
                    found_prereleases.append(version)
                # Either this is not a prerelease, or we should have been
                # accepting prereleases from the begining.
                else:
                    yielded = True
                    yield version

        # Now that we've iterated over everything, determine if we've yielded
        # any values, and if we have not and we have any prereleases stored up
        # then we will go ahead and yield the prereleases.
        if not yielded and found_prereleases:
            for version in found_prereleases:
                yield version


class LegacySpecifier(_IndividualSpecifier):

    _regex_str = (
        r"""
        (?P<operator>(==|!=|<=|>=|<|>))
        \s*
        (?P<version>
            [^,;\s)]* # Since this is a "legacy" specifier, and the version
                      # string can be just about anything, we match everything
                      # except for whitespace, a semi-colon for marker support,
                      # a closing paren since versions can be enclosed in
                      # them, and a comma since it's a version separator.
        )
        """
    )

    _regex = re.compile(
        r"^\s*" + _regex_str + r"\s*$", re.VERBOSE | re.IGNORECASE)

    _operators = {
        "==": "equal",
        "!=": "not_equal",
        "<=": "less_than_equal",
        ">=": "greater_than_equal",
        "<": "less_than",
        ">": "greater_than",
    }

    def _coerce_version(self, version):
        if not isinstance(version, LegacyVersion):
            version = LegacyVersion(str(version))
        return version

    def _compare_equal(self, prospective, spec):
        return prospective == self._coerce_version(spec)

    def _compare_not_equal(self, prospective, spec):
        return prospective != self._coerce_version(spec)

    def _compare_less_than_equal(self, prospective, spec):
        return prospective <= self._coerce_version(spec)

    def _compare_greater_than_equal(self, prospective, spec):
        return prospective >= self._coerce_version(spec)

    def _compare_less_than(self, prospective, spec):
        return prospective < self._coerce_version(spec)

    def _compare_greater_than(self, prospective, spec):
        return prospective > self._coerce_version(spec)


def _require_version_compare(fn):
    @functools.wraps(fn)
    def wrapped(self, prospective, spec):
        if not isinstance(prospective, Version):
            return False
        return fn(self, prospective, spec)
    return wrapped


class Specifier(_IndividualSpecifier):

    _regex_str = (
        r"""
        (?P<operator>(~=|==|!=|<=|>=|<|>|===))
        (?P<version>
            (?:
                # The identity operators allow for an escape hatch that will
                # do an exact string match of the version you wish to install.
                # This will not be parsed by PEP 440 and we cannot determine
                # any semantic meaning from it. This operator is discouraged
                # but included entirely as an escape hatch.
                (?<====)  # Only match for the identity operator
                \s*
                [^\s]*    # We just match everything, except for whitespace
                          # since we are only testing for strict identity.
            )
            |
            (?:
                # The (non)equality operators allow for wild card and local
                # versions to be specified so we have to define these two
                # operators separately to enable that.
                (?<===|!=)            # Only match for equals and not equals

                \s*
                v?
                (?:[0-9]+!)?          # epoch
                [0-9]+(?:\.[0-9]+)*   # release
                (?:                   # pre release
                    [-_\.]?
                    (a|b|c|rc|alpha|beta|pre|preview)
                    [-_\.]?
                    [0-9]*
                )?
                (?:                   # post release
                    (?:-[0-9]+)|(?:[-_\.]?(post|rev|r)[-_\.]?[0-9]*)
                )?

                # You cannot use a wild card and a dev or local version
                # together so group them with a | and make them optional.
                (?:
                    (?:[-_\.]?dev[-_\.]?[0-9]*)?         # dev release
                    (?:\+[a-z0-9]+(?:[-_\.][a-z0-9]+)*)? # local
                    |
                    \.\*  # Wild card syntax of .*
                )?
            )
            |
            (?:
                # The compatible operator requires at least two digits in the
                # release segment.
                (?<=~=)               # Only match for the compatible operator

                \s*
                v?
                (?:[0-9]+!)?          # epoch
                [0-9]+(?:\.[0-9]+)+   # release  (We have a + instead of a *)
|
||||||
|
(?: # pre release
|
||||||
|
[-_\.]?
|
||||||
|
(a|b|c|rc|alpha|beta|pre|preview)
|
||||||
|
[-_\.]?
|
||||||
|
[0-9]*
|
||||||
|
)?
|
||||||
|
(?: # post release
|
||||||
|
(?:-[0-9]+)|(?:[-_\.]?(post|rev|r)[-_\.]?[0-9]*)
|
||||||
|
)?
|
||||||
|
(?:[-_\.]?dev[-_\.]?[0-9]*)? # dev release
|
||||||
|
)
|
||||||
|
|
|
||||||
|
(?:
|
||||||
|
# All other operators only allow a sub set of what the
|
||||||
|
# (non)equality operators do. Specifically they do not allow
|
||||||
|
# local versions to be specified nor do they allow the prefix
|
||||||
|
# matching wild cards.
|
||||||
|
(?<!==|!=|~=) # We have special cases for these
|
||||||
|
# operators so we want to make sure they
|
||||||
|
# don't match here.
|
||||||
|
|
||||||
|
\s*
|
||||||
|
v?
|
||||||
|
(?:[0-9]+!)? # epoch
|
||||||
|
[0-9]+(?:\.[0-9]+)* # release
|
||||||
|
(?: # pre release
|
||||||
|
[-_\.]?
|
||||||
|
(a|b|c|rc|alpha|beta|pre|preview)
|
||||||
|
[-_\.]?
|
||||||
|
[0-9]*
|
||||||
|
)?
|
||||||
|
(?: # post release
|
||||||
|
(?:-[0-9]+)|(?:[-_\.]?(post|rev|r)[-_\.]?[0-9]*)
|
||||||
|
)?
|
||||||
|
(?:[-_\.]?dev[-_\.]?[0-9]*)? # dev release
|
||||||
|
)
|
||||||
|
)
|
||||||
|
"""
|
||||||
|
)
|
||||||
|
|
||||||
|
_regex = re.compile(
|
||||||
|
r"^\s*" + _regex_str + r"\s*$", re.VERBOSE | re.IGNORECASE)
|
||||||
|
|
||||||
|
_operators = {
|
||||||
|
"~=": "compatible",
|
||||||
|
"==": "equal",
|
||||||
|
"!=": "not_equal",
|
||||||
|
"<=": "less_than_equal",
|
||||||
|
">=": "greater_than_equal",
|
||||||
|
"<": "less_than",
|
||||||
|
">": "greater_than",
|
||||||
|
"===": "arbitrary",
|
||||||
|
}
|
||||||
|
|
||||||
|
@_require_version_compare
|
||||||
|
def _compare_compatible(self, prospective, spec):
|
||||||
|
# Compatible releases have an equivalent combination of >= and ==. That
|
||||||
|
# is that ~=2.2 is equivalent to >=2.2,==2.*. This allows us to
|
||||||
|
# implement this in terms of the other specifiers instead of
|
||||||
|
# implementing it ourselves. The only thing we need to do is construct
|
||||||
|
# the other specifiers.
|
||||||
|
|
||||||
|
# We want everything but the last item in the version, but we want to
|
||||||
|
# ignore post and dev releases and we want to treat the pre-release as
|
||||||
|
# it's own separate segment.
|
||||||
|
prefix = ".".join(
|
||||||
|
list(
|
||||||
|
itertools.takewhile(
|
||||||
|
lambda x: (not x.startswith("post") and not
|
||||||
|
x.startswith("dev")),
|
||||||
|
_version_split(spec),
|
||||||
|
)
|
||||||
|
)[:-1]
|
||||||
|
)
|
||||||
|
|
||||||
|
# Add the prefix notation to the end of our string
|
||||||
|
prefix += ".*"
|
||||||
|
|
||||||
|
return (self._get_operator(">=")(prospective, spec) and
|
||||||
|
self._get_operator("==")(prospective, prefix))
|
||||||
|
|
||||||
|
@_require_version_compare
|
||||||
|
def _compare_equal(self, prospective, spec):
|
||||||
|
# We need special logic to handle prefix matching
|
||||||
|
if spec.endswith(".*"):
|
||||||
|
# In the case of prefix matching we want to ignore local segment.
|
||||||
|
prospective = Version(prospective.public)
|
||||||
|
# Split the spec out by dots, and pretend that there is an implicit
|
||||||
|
# dot in between a release segment and a pre-release segment.
|
||||||
|
spec = _version_split(spec[:-2]) # Remove the trailing .*
|
||||||
|
|
||||||
|
# Split the prospective version out by dots, and pretend that there
|
||||||
|
# is an implicit dot in between a release segment and a pre-release
|
||||||
|
# segment.
|
||||||
|
prospective = _version_split(str(prospective))
|
||||||
|
|
||||||
|
# Shorten the prospective version to be the same length as the spec
|
||||||
|
# so that we can determine if the specifier is a prefix of the
|
||||||
|
# prospective version or not.
|
||||||
|
prospective = prospective[:len(spec)]
|
||||||
|
|
||||||
|
# Pad out our two sides with zeros so that they both equal the same
|
||||||
|
# length.
|
||||||
|
spec, prospective = _pad_version(spec, prospective)
|
||||||
|
else:
|
||||||
|
# Convert our spec string into a Version
|
||||||
|
spec = Version(spec)
|
||||||
|
|
||||||
|
# If the specifier does not have a local segment, then we want to
|
||||||
|
# act as if the prospective version also does not have a local
|
||||||
|
# segment.
|
||||||
|
if not spec.local:
|
||||||
|
prospective = Version(prospective.public)
|
||||||
|
|
||||||
|
return prospective == spec
|
||||||
|
|
||||||
|
@_require_version_compare
|
||||||
|
def _compare_not_equal(self, prospective, spec):
|
||||||
|
return not self._compare_equal(prospective, spec)
|
||||||
|
|
||||||
|
@_require_version_compare
|
||||||
|
def _compare_less_than_equal(self, prospective, spec):
|
||||||
|
return prospective <= Version(spec)
|
||||||
|
|
||||||
|
@_require_version_compare
|
||||||
|
def _compare_greater_than_equal(self, prospective, spec):
|
||||||
|
return prospective >= Version(spec)
|
||||||
|
|
||||||
|
@_require_version_compare
|
||||||
|
def _compare_less_than(self, prospective, spec):
|
||||||
|
# Convert our spec to a Version instance, since we'll want to work with
|
||||||
|
# it as a version.
|
||||||
|
spec = Version(spec)
|
||||||
|
|
||||||
|
# Check to see if the prospective version is less than the spec
|
||||||
|
# version. If it's not we can short circuit and just return False now
|
||||||
|
# instead of doing extra unneeded work.
|
||||||
|
if not prospective < spec:
|
||||||
|
return False
|
||||||
|
|
||||||
|
# This special case is here so that, unless the specifier itself
|
||||||
|
# includes is a pre-release version, that we do not accept pre-release
|
||||||
|
# versions for the version mentioned in the specifier (e.g. <3.1 should
|
||||||
|
# not match 3.1.dev0, but should match 3.0.dev0).
|
||||||
|
if not spec.is_prerelease and prospective.is_prerelease:
|
||||||
|
if Version(prospective.base_version) == Version(spec.base_version):
|
||||||
|
return False
|
||||||
|
|
||||||
|
# If we've gotten to here, it means that prospective version is both
|
||||||
|
# less than the spec version *and* it's not a pre-release of the same
|
||||||
|
# version in the spec.
|
||||||
|
return True
|
||||||
|
|
||||||
|
@_require_version_compare
|
||||||
|
def _compare_greater_than(self, prospective, spec):
|
||||||
|
# Convert our spec to a Version instance, since we'll want to work with
|
||||||
|
# it as a version.
|
||||||
|
spec = Version(spec)
|
||||||
|
|
||||||
|
# Check to see if the prospective version is greater than the spec
|
||||||
|
# version. If it's not we can short circuit and just return False now
|
||||||
|
# instead of doing extra unneeded work.
|
||||||
|
if not prospective > spec:
|
||||||
|
return False
|
||||||
|
|
||||||
|
# This special case is here so that, unless the specifier itself
|
||||||
|
# includes is a post-release version, that we do not accept
|
||||||
|
# post-release versions for the version mentioned in the specifier
|
||||||
|
# (e.g. >3.1 should not match 3.0.post0, but should match 3.2.post0).
|
||||||
|
if not spec.is_postrelease and prospective.is_postrelease:
|
||||||
|
if Version(prospective.base_version) == Version(spec.base_version):
|
||||||
|
return False
|
||||||
|
|
||||||
|
# Ensure that we do not allow a local version of the version mentioned
|
||||||
|
# in the specifier, which is techincally greater than, to match.
|
||||||
|
if prospective.local is not None:
|
||||||
|
if Version(prospective.base_version) == Version(spec.base_version):
|
||||||
|
return False
|
||||||
|
|
||||||
|
# If we've gotten to here, it means that prospective version is both
|
||||||
|
# greater than the spec version *and* it's not a pre-release of the
|
||||||
|
# same version in the spec.
|
||||||
|
return True
|
||||||
|
|
||||||
|
def _compare_arbitrary(self, prospective, spec):
|
||||||
|
return str(prospective).lower() == str(spec).lower()
|
||||||
|
|
||||||
|
@property
|
||||||
|
def prereleases(self):
|
||||||
|
# If there is an explicit prereleases set for this, then we'll just
|
||||||
|
# blindly use that.
|
||||||
|
if self._prereleases is not None:
|
||||||
|
return self._prereleases
|
||||||
|
|
||||||
|
# Look at all of our specifiers and determine if they are inclusive
|
||||||
|
# operators, and if they are if they are including an explicit
|
||||||
|
# prerelease.
|
||||||
|
operator, version = self._spec
|
||||||
|
if operator in ["==", ">=", "<=", "~=", "==="]:
|
||||||
|
# The == specifier can include a trailing .*, if it does we
|
||||||
|
# want to remove before parsing.
|
||||||
|
if operator == "==" and version.endswith(".*"):
|
||||||
|
version = version[:-2]
|
||||||
|
|
||||||
|
# Parse the version, and if it is a pre-release than this
|
||||||
|
# specifier allows pre-releases.
|
||||||
|
if parse(version).is_prerelease:
|
||||||
|
return True
|
||||||
|
|
||||||
|
return False
|
||||||
|
|
||||||
|
@prereleases.setter
|
||||||
|
def prereleases(self, value):
|
||||||
|
self._prereleases = value
|
||||||
|
|
||||||
|
|
||||||
|
_prefix_regex = re.compile(r"^([0-9]+)((?:a|b|c|rc)[0-9]+)$")
|
||||||
|
|
||||||
|
|
||||||
|
def _version_split(version):
|
||||||
|
result = []
|
||||||
|
for item in version.split("."):
|
||||||
|
match = _prefix_regex.search(item)
|
||||||
|
if match:
|
||||||
|
result.extend(match.groups())
|
||||||
|
else:
|
||||||
|
result.append(item)
|
||||||
|
return result
|
||||||
|
|
||||||
|
|
||||||
|
def _pad_version(left, right):
|
||||||
|
left_split, right_split = [], []
|
||||||
|
|
||||||
|
# Get the release segment of our versions
|
||||||
|
left_split.append(list(itertools.takewhile(lambda x: x.isdigit(), left)))
|
||||||
|
right_split.append(list(itertools.takewhile(lambda x: x.isdigit(), right)))
|
||||||
|
|
||||||
|
# Get the rest of our versions
|
||||||
|
left_split.append(left[len(left_split[0]):])
|
||||||
|
right_split.append(right[len(right_split[0]):])
|
||||||
|
|
||||||
|
# Insert our padding
|
||||||
|
left_split.insert(
|
||||||
|
1,
|
||||||
|
["0"] * max(0, len(right_split[0]) - len(left_split[0])),
|
||||||
|
)
|
||||||
|
right_split.insert(
|
||||||
|
1,
|
||||||
|
["0"] * max(0, len(left_split[0]) - len(right_split[0])),
|
||||||
|
)
|
||||||
|
|
||||||
|
return (
|
||||||
|
list(itertools.chain(*left_split)),
|
||||||
|
list(itertools.chain(*right_split)),
|
||||||
|
)
|
||||||
|
|
||||||
|
|
||||||
|
class SpecifierSet(BaseSpecifier):
|
||||||
|
|
||||||
|
def __init__(self, specifiers="", prereleases=None):
|
||||||
|
# Split on , to break each indidivual specifier into it's own item, and
|
||||||
|
# strip each item to remove leading/trailing whitespace.
|
||||||
|
specifiers = [s.strip() for s in specifiers.split(",") if s.strip()]
|
||||||
|
|
||||||
|
# Parsed each individual specifier, attempting first to make it a
|
||||||
|
# Specifier and falling back to a LegacySpecifier.
|
||||||
|
parsed = set()
|
||||||
|
for specifier in specifiers:
|
||||||
|
try:
|
||||||
|
parsed.add(Specifier(specifier))
|
||||||
|
except InvalidSpecifier:
|
||||||
|
parsed.add(LegacySpecifier(specifier))
|
||||||
|
|
||||||
|
# Turn our parsed specifiers into a frozen set and save them for later.
|
||||||
|
self._specs = frozenset(parsed)
|
||||||
|
|
||||||
|
# Store our prereleases value so we can use it later to determine if
|
||||||
|
# we accept prereleases or not.
|
||||||
|
self._prereleases = prereleases
|
||||||
|
|
||||||
|
def __repr__(self):
|
||||||
|
pre = (
|
||||||
|
", prereleases={0!r}".format(self.prereleases)
|
||||||
|
if self._prereleases is not None
|
||||||
|
else ""
|
||||||
|
)
|
||||||
|
|
||||||
|
return "<SpecifierSet({0!r}{1})>".format(str(self), pre)
|
||||||
|
|
||||||
|
def __str__(self):
|
||||||
|
return ",".join(sorted(str(s) for s in self._specs))
|
||||||
|
|
||||||
|
def __hash__(self):
|
||||||
|
return hash(self._specs)
|
||||||
|
|
||||||
|
def __and__(self, other):
|
||||||
|
if isinstance(other, string_types):
|
||||||
|
other = SpecifierSet(other)
|
||||||
|
elif not isinstance(other, SpecifierSet):
|
||||||
|
return NotImplemented
|
||||||
|
|
||||||
|
specifier = SpecifierSet()
|
||||||
|
specifier._specs = frozenset(self._specs | other._specs)
|
||||||
|
|
||||||
|
if self._prereleases is None and other._prereleases is not None:
|
||||||
|
specifier._prereleases = other._prereleases
|
||||||
|
elif self._prereleases is not None and other._prereleases is None:
|
||||||
|
specifier._prereleases = self._prereleases
|
||||||
|
elif self._prereleases == other._prereleases:
|
||||||
|
specifier._prereleases = self._prereleases
|
||||||
|
else:
|
||||||
|
raise ValueError(
|
||||||
|
"Cannot combine SpecifierSets with True and False prerelease "
|
||||||
|
"overrides."
|
||||||
|
)
|
||||||
|
|
||||||
|
return specifier
|
||||||
|
|
||||||
|
def __eq__(self, other):
|
||||||
|
if isinstance(other, string_types):
|
||||||
|
other = SpecifierSet(other)
|
||||||
|
elif isinstance(other, _IndividualSpecifier):
|
||||||
|
other = SpecifierSet(str(other))
|
||||||
|
elif not isinstance(other, SpecifierSet):
|
||||||
|
return NotImplemented
|
||||||
|
|
||||||
|
return self._specs == other._specs
|
||||||
|
|
||||||
|
def __ne__(self, other):
|
||||||
|
if isinstance(other, string_types):
|
||||||
|
other = SpecifierSet(other)
|
||||||
|
elif isinstance(other, _IndividualSpecifier):
|
||||||
|
other = SpecifierSet(str(other))
|
||||||
|
elif not isinstance(other, SpecifierSet):
|
||||||
|
return NotImplemented
|
||||||
|
|
||||||
|
return self._specs != other._specs
|
||||||
|
|
||||||
|
def __len__(self):
|
||||||
|
return len(self._specs)
|
||||||
|
|
||||||
|
def __iter__(self):
|
||||||
|
return iter(self._specs)
|
||||||
|
|
||||||
|
@property
|
||||||
|
def prereleases(self):
|
||||||
|
# If we have been given an explicit prerelease modifier, then we'll
|
||||||
|
# pass that through here.
|
||||||
|
if self._prereleases is not None:
|
||||||
|
return self._prereleases
|
||||||
|
|
||||||
|
# If we don't have any specifiers, and we don't have a forced value,
|
||||||
|
# then we'll just return None since we don't know if this should have
|
||||||
|
# pre-releases or not.
|
||||||
|
if not self._specs:
|
||||||
|
return None
|
||||||
|
|
||||||
|
# Otherwise we'll see if any of the given specifiers accept
|
||||||
|
# prereleases, if any of them do we'll return True, otherwise False.
|
||||||
|
return any(s.prereleases for s in self._specs)
|
||||||
|
|
||||||
|
@prereleases.setter
|
||||||
|
def prereleases(self, value):
|
||||||
|
self._prereleases = value
|
||||||
|
|
||||||
|
def __contains__(self, item):
|
||||||
|
return self.contains(item)
|
||||||
|
|
||||||
|
def contains(self, item, prereleases=None):
|
||||||
|
# Ensure that our item is a Version or LegacyVersion instance.
|
||||||
|
if not isinstance(item, (LegacyVersion, Version)):
|
||||||
|
item = parse(item)
|
||||||
|
|
||||||
|
# Determine if we're forcing a prerelease or not, if we're not forcing
|
||||||
|
# one for this particular filter call, then we'll use whatever the
|
||||||
|
# SpecifierSet thinks for whether or not we should support prereleases.
|
||||||
|
if prereleases is None:
|
||||||
|
prereleases = self.prereleases
|
||||||
|
|
||||||
|
# We can determine if we're going to allow pre-releases by looking to
|
||||||
|
# see if any of the underlying items supports them. If none of them do
|
||||||
|
# and this item is a pre-release then we do not allow it and we can
|
||||||
|
# short circuit that here.
|
||||||
|
# Note: This means that 1.0.dev1 would not be contained in something
|
||||||
|
# like >=1.0.devabc however it would be in >=1.0.debabc,>0.0.dev0
|
||||||
|
if not prereleases and item.is_prerelease:
|
||||||
|
return False
|
||||||
|
|
||||||
|
# We simply dispatch to the underlying specs here to make sure that the
|
||||||
|
# given version is contained within all of them.
|
||||||
|
# Note: This use of all() here means that an empty set of specifiers
|
||||||
|
# will always return True, this is an explicit design decision.
|
||||||
|
return all(
|
||||||
|
s.contains(item, prereleases=prereleases)
|
||||||
|
for s in self._specs
|
||||||
|
)
|
||||||
|
|
||||||
|
def filter(self, iterable, prereleases=None):
|
||||||
|
# Determine if we're forcing a prerelease or not, if we're not forcing
|
||||||
|
# one for this particular filter call, then we'll use whatever the
|
||||||
|
# SpecifierSet thinks for whether or not we should support prereleases.
|
||||||
|
if prereleases is None:
|
||||||
|
prereleases = self.prereleases
|
||||||
|
|
||||||
|
# If we have any specifiers, then we want to wrap our iterable in the
|
||||||
|
# filter method for each one, this will act as a logical AND amongst
|
||||||
|
# each specifier.
|
||||||
|
if self._specs:
|
||||||
|
for spec in self._specs:
|
||||||
|
iterable = spec.filter(iterable, prereleases=bool(prereleases))
|
||||||
|
return iterable
|
||||||
|
# If we do not have any specifiers, then we need to have a rough filter
|
||||||
|
# which will filter out any pre-releases, unless there are no final
|
||||||
|
# releases, and which will filter out LegacyVersion in general.
|
||||||
|
else:
|
||||||
|
filtered = []
|
||||||
|
found_prereleases = []
|
||||||
|
|
||||||
|
for item in iterable:
|
||||||
|
# Ensure that we some kind of Version class for this item.
|
||||||
|
if not isinstance(item, (LegacyVersion, Version)):
|
||||||
|
parsed_version = parse(item)
|
||||||
|
else:
|
||||||
|
parsed_version = item
|
||||||
|
|
||||||
|
# Filter out any item which is parsed as a LegacyVersion
|
||||||
|
if isinstance(parsed_version, LegacyVersion):
|
||||||
|
continue
|
||||||
|
|
||||||
|
# Store any item which is a pre-release for later unless we've
|
||||||
|
# already found a final version or we are accepting prereleases
|
||||||
|
if parsed_version.is_prerelease and not prereleases:
|
||||||
|
if not filtered:
|
||||||
|
found_prereleases.append(item)
|
||||||
|
else:
|
||||||
|
filtered.append(item)
|
||||||
|
|
||||||
|
# If we've found no items except for pre-releases, then we'll go
|
||||||
|
# ahead and use the pre-releases
|
||||||
|
if not filtered and found_prereleases and prereleases is None:
|
||||||
|
return found_prereleases
|
||||||
|
|
||||||
|
return filtered
|
||||||
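[Audit note] As a sanity check while comparing specifier behaviour, the `SpecifierSet` semantics above (logical AND across comma-separated clauses, exclusive bounds) can be exercised against the standalone `packaging` distribution, which this vendored copy originates from. This assumes `packaging` is importable in the audit environment; it ships alongside pip/setuptools in most Python installs.

```python
from packaging.specifiers import SpecifierSet

spec = SpecifierSet(">=1.0,<2.0")

print("1.4" in spec)   # inside the range -> True
print("2.0" in spec)   # upper bound is exclusive -> False

# filter() applies each clause as a logical AND over the iterable.
print(list(spec.filter(["0.9", "1.0", "1.5", "2.0"])))  # ['1.0', '1.5']
```

Note that `contains`/`filter` accept plain strings and parse them internally, exactly as the `contains` implementation above shows.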
@@ -0,0 +1,14 @@
# This file is dual licensed under the terms of the Apache License, Version
# 2.0, and the BSD License. See the LICENSE file in the root of this repository
# for complete details.
from __future__ import absolute_import, division, print_function

import re


_canonicalize_regex = re.compile(r"[-_.]+")


def canonicalize_name(name):
    # This is taken from PEP 503.
    return _canonicalize_regex.sub("-", name).lower()
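[Audit note] The `canonicalize_name` helper is small enough to verify in isolation: it collapses any run of `-`, `_`, or `.` to a single `-` and lowercases, which is the PEP 503 project-name normalisation rule. A self-contained re-statement for comparison:

```python
import re

_canonicalize_regex = re.compile(r"[-_.]+")


def canonicalize_name(name):
    # Collapse runs of '-', '_', '.' to a single '-' and lowercase (PEP 503).
    return _canonicalize_regex.sub("-", name).lower()


print(canonicalize_name("Django_REST--framework"))  # django-rest-framework
```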
@@ -0,0 +1,393 @@
# This file is dual licensed under the terms of the Apache License, Version
# 2.0, and the BSD License. See the LICENSE file in the root of this repository
# for complete details.
from __future__ import absolute_import, division, print_function

import collections
import itertools
import re

from ._structures import Infinity


__all__ = [
    "parse", "Version", "LegacyVersion", "InvalidVersion", "VERSION_PATTERN"
]


_Version = collections.namedtuple(
    "_Version",
    ["epoch", "release", "dev", "pre", "post", "local"],
)


def parse(version):
    """
    Parse the given version string and return either a :class:`Version` object
    or a :class:`LegacyVersion` object depending on if the given version is
    a valid PEP 440 version or a legacy version.
    """
    try:
        return Version(version)
    except InvalidVersion:
        return LegacyVersion(version)


class InvalidVersion(ValueError):
    """
    An invalid version was found, users should refer to PEP 440.
    """


class _BaseVersion(object):

    def __hash__(self):
        return hash(self._key)

    def __lt__(self, other):
        return self._compare(other, lambda s, o: s < o)

    def __le__(self, other):
        return self._compare(other, lambda s, o: s <= o)

    def __eq__(self, other):
        return self._compare(other, lambda s, o: s == o)

    def __ge__(self, other):
        return self._compare(other, lambda s, o: s >= o)

    def __gt__(self, other):
        return self._compare(other, lambda s, o: s > o)

    def __ne__(self, other):
        return self._compare(other, lambda s, o: s != o)

    def _compare(self, other, method):
        if not isinstance(other, _BaseVersion):
            return NotImplemented

        return method(self._key, other._key)


class LegacyVersion(_BaseVersion):

    def __init__(self, version):
        self._version = str(version)
        self._key = _legacy_cmpkey(self._version)

    def __str__(self):
        return self._version

    def __repr__(self):
        return "<LegacyVersion({0})>".format(repr(str(self)))

    @property
    def public(self):
        return self._version

    @property
    def base_version(self):
        return self._version

    @property
    def local(self):
        return None

    @property
    def is_prerelease(self):
        return False

    @property
    def is_postrelease(self):
        return False


_legacy_version_component_re = re.compile(
    r"(\d+ | [a-z]+ | \.| -)", re.VERBOSE,
)

_legacy_version_replacement_map = {
    "pre": "c", "preview": "c", "-": "final-", "rc": "c", "dev": "@",
}


def _parse_version_parts(s):
    for part in _legacy_version_component_re.split(s):
        part = _legacy_version_replacement_map.get(part, part)

        if not part or part == ".":
            continue

        if part[:1] in "0123456789":
            # pad for numeric comparison
            yield part.zfill(8)
        else:
            yield "*" + part

    # ensure that alpha/beta/candidate are before final
    yield "*final"


def _legacy_cmpkey(version):
    # We hardcode an epoch of -1 here. A PEP 440 version can only have an epoch
    # greater than or equal to 0. This will effectively put the LegacyVersion,
    # which uses the de facto standard originally implemented by setuptools,
    # before all PEP 440 versions.
    epoch = -1

    # This scheme is taken from pkg_resources.parse_version of setuptools prior
    # to its adoption of the packaging library.
    parts = []
    for part in _parse_version_parts(version.lower()):
        if part.startswith("*"):
            # remove "-" before a prerelease tag
            if part < "*final":
                while parts and parts[-1] == "*final-":
                    parts.pop()

            # remove trailing zeros from each series of numeric parts
            while parts and parts[-1] == "00000000":
                parts.pop()

        parts.append(part)
    parts = tuple(parts)

    return epoch, parts

# Deliberately not anchored to the start and end of the string, to make it
# easier for 3rd party code to reuse
VERSION_PATTERN = r"""
    v?
    (?:
        (?:(?P<epoch>[0-9]+)!)?                           # epoch
        (?P<release>[0-9]+(?:\.[0-9]+)*)                  # release segment
        (?P<pre>                                          # pre-release
            [-_\.]?
            (?P<pre_l>(a|b|c|rc|alpha|beta|pre|preview))
            [-_\.]?
            (?P<pre_n>[0-9]+)?
        )?
        (?P<post>                                         # post release
            (?:-(?P<post_n1>[0-9]+))
            |
            (?:
                [-_\.]?
                (?P<post_l>post|rev|r)
                [-_\.]?
                (?P<post_n2>[0-9]+)?
            )
        )?
        (?P<dev>                                          # dev release
            [-_\.]?
            (?P<dev_l>dev)
            [-_\.]?
            (?P<dev_n>[0-9]+)?
        )?
    )
    (?:\+(?P<local>[a-z0-9]+(?:[-_\.][a-z0-9]+)*))?       # local version
"""


class Version(_BaseVersion):

    _regex = re.compile(
        r"^\s*" + VERSION_PATTERN + r"\s*$",
        re.VERBOSE | re.IGNORECASE,
    )

    def __init__(self, version):
        # Validate the version and parse it into pieces
        match = self._regex.search(version)
        if not match:
            raise InvalidVersion("Invalid version: '{0}'".format(version))

        # Store the parsed out pieces of the version
        self._version = _Version(
            epoch=int(match.group("epoch")) if match.group("epoch") else 0,
            release=tuple(int(i) for i in match.group("release").split(".")),
            pre=_parse_letter_version(
                match.group("pre_l"),
                match.group("pre_n"),
            ),
            post=_parse_letter_version(
                match.group("post_l"),
                match.group("post_n1") or match.group("post_n2"),
            ),
            dev=_parse_letter_version(
                match.group("dev_l"),
                match.group("dev_n"),
            ),
            local=_parse_local_version(match.group("local")),
        )

        # Generate a key which will be used for sorting
        self._key = _cmpkey(
            self._version.epoch,
            self._version.release,
            self._version.pre,
            self._version.post,
            self._version.dev,
            self._version.local,
        )

    def __repr__(self):
        return "<Version({0})>".format(repr(str(self)))

    def __str__(self):
        parts = []

        # Epoch
        if self._version.epoch != 0:
            parts.append("{0}!".format(self._version.epoch))

        # Release segment
        parts.append(".".join(str(x) for x in self._version.release))

        # Pre-release
        if self._version.pre is not None:
            parts.append("".join(str(x) for x in self._version.pre))

        # Post-release
        if self._version.post is not None:
            parts.append(".post{0}".format(self._version.post[1]))

        # Development release
        if self._version.dev is not None:
            parts.append(".dev{0}".format(self._version.dev[1]))

        # Local version segment
        if self._version.local is not None:
            parts.append(
                "+{0}".format(".".join(str(x) for x in self._version.local))
            )

        return "".join(parts)

    @property
    def public(self):
        return str(self).split("+", 1)[0]

    @property
    def base_version(self):
        parts = []

        # Epoch
        if self._version.epoch != 0:
            parts.append("{0}!".format(self._version.epoch))

        # Release segment
        parts.append(".".join(str(x) for x in self._version.release))

        return "".join(parts)

    @property
    def local(self):
        version_string = str(self)
        if "+" in version_string:
            return version_string.split("+", 1)[1]

    @property
    def is_prerelease(self):
        return bool(self._version.dev or self._version.pre)

    @property
    def is_postrelease(self):
        return bool(self._version.post)


def _parse_letter_version(letter, number):
    if letter:
        # We consider there to be an implicit 0 in a pre-release if there is
        # not a numeral associated with it.
        if number is None:
            number = 0

        # We normalize any letters to their lower case form
        letter = letter.lower()

        # We consider some words to be alternate spellings of other words and
        # in those cases we want to normalize the spellings to our preferred
        # spelling.
        if letter == "alpha":
            letter = "a"
        elif letter == "beta":
            letter = "b"
        elif letter in ["c", "pre", "preview"]:
            letter = "rc"
        elif letter in ["rev", "r"]:
            letter = "post"

        return letter, int(number)
    if not letter and number:
        # We assume if we are given a number, but we are not given a letter
        # then this is using the implicit post release syntax (e.g. 1.0-1)
        letter = "post"

        return letter, int(number)


_local_version_seperators = re.compile(r"[\._-]")


def _parse_local_version(local):
    """
    Takes a string like abc.1.twelve and turns it into ("abc", 1, "twelve").
    """
    if local is not None:
        return tuple(
            part.lower() if not part.isdigit() else int(part)
            for part in _local_version_seperators.split(local)
        )


def _cmpkey(epoch, release, pre, post, dev, local):
    # When we compare a release version, we want to compare it with all of the
    # trailing zeros removed. So we'll reverse the list, drop all the now
    # leading zeros until we come to something non-zero, then re-reverse the
    # rest back into the correct order, make it a tuple and use that for our
    # sorting key.
    release = tuple(
        reversed(list(
            itertools.dropwhile(
                lambda x: x == 0,
                reversed(release),
            )
        ))
    )

    # We need to "trick" the sorting algorithm to put 1.0.dev0 before 1.0a0.
    # We'll do this by abusing the pre segment, but we _only_ want to do this
    # if there is not a pre or a post segment. If we have one of those then
    # the normal sorting rules will handle this case correctly.
    if pre is None and post is None and dev is not None:
        pre = -Infinity
    # Versions without a pre-release (except as noted above) should sort after
    # those with one.
    elif pre is None:
        pre = Infinity

    # Versions without a post segment should sort before those with one.
    if post is None:
        post = -Infinity

    # Versions without a development segment should sort after those with one.
    if dev is None:
        dev = Infinity

    if local is None:
        # Versions without a local segment should sort before those with one.
        local = -Infinity
    else:
        # Versions with a local segment need that segment parsed to implement
        # the sorting rules in PEP 440.
        # - Alphanumeric segments sort before numeric segments
        # - Alphanumeric segments sort lexicographically
        # - Numeric segments sort numerically
|
||||||
|
# - Shorter versions sort before longer versions when the prefixes
|
||||||
|
# match exactly
|
||||||
|
local = tuple(
|
||||||
|
(i, "") if isinstance(i, int) else (-Infinity, i)
|
||||||
|
for i in local
|
||||||
|
)
|
||||||
|
|
||||||
|
return epoch, release, pre, post, dev, local
|
||||||
@@ -0,0 +1,868 @@
"""Utilities for writing code that runs on Python 2 and 3"""

# Copyright (c) 2010-2015 Benjamin Peterson
#
# Permission is hereby granted, free of charge, to any person obtaining a copy
# of this software and associated documentation files (the "Software"), to deal
# in the Software without restriction, including without limitation the rights
# to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
# copies of the Software, and to permit persons to whom the Software is
# furnished to do so, subject to the following conditions:
#
# The above copyright notice and this permission notice shall be included in all
# copies or substantial portions of the Software.
#
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
# AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
# OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
# SOFTWARE.

from __future__ import absolute_import

import functools
import itertools
import operator
import sys
import types

__author__ = "Benjamin Peterson <benjamin@python.org>"
__version__ = "1.10.0"


# Useful for very coarse version differentiation.
PY2 = sys.version_info[0] == 2
PY3 = sys.version_info[0] == 3
PY34 = sys.version_info[0:2] >= (3, 4)

if PY3:
    string_types = str,
    integer_types = int,
    class_types = type,
    text_type = str
    binary_type = bytes

    MAXSIZE = sys.maxsize
else:
    string_types = basestring,
    integer_types = (int, long)
    class_types = (type, types.ClassType)
    text_type = unicode
    binary_type = str

    if sys.platform.startswith("java"):
        # Jython always uses 32 bits.
        MAXSIZE = int((1 << 31) - 1)
    else:
        # It's possible to have sizeof(long) != sizeof(Py_ssize_t).
        class X(object):

            def __len__(self):
                return 1 << 31
        try:
            len(X())
        except OverflowError:
            # 32-bit
            MAXSIZE = int((1 << 31) - 1)
        else:
            # 64-bit
            MAXSIZE = int((1 << 63) - 1)
        del X

def _add_doc(func, doc):
    """Add documentation to a function."""
    func.__doc__ = doc


def _import_module(name):
    """Import module, returning the module after the last dot."""
    __import__(name)
    return sys.modules[name]


class _LazyDescr(object):

    def __init__(self, name):
        self.name = name

    def __get__(self, obj, tp):
        result = self._resolve()
        setattr(obj, self.name, result)  # Invokes __set__.
        try:
            # This is a bit ugly, but it avoids running this again by
            # removing this descriptor.
            delattr(obj.__class__, self.name)
        except AttributeError:
            pass
        return result


class MovedModule(_LazyDescr):

    def __init__(self, name, old, new=None):
        super(MovedModule, self).__init__(name)
        if PY3:
            if new is None:
                new = name
            self.mod = new
        else:
            self.mod = old

    def _resolve(self):
        return _import_module(self.mod)

    def __getattr__(self, attr):
        _module = self._resolve()
        value = getattr(_module, attr)
        setattr(self, attr, value)
        return value


class _LazyModule(types.ModuleType):

    def __init__(self, name):
        super(_LazyModule, self).__init__(name)
        self.__doc__ = self.__class__.__doc__

    def __dir__(self):
        attrs = ["__doc__", "__name__"]
        attrs += [attr.name for attr in self._moved_attributes]
        return attrs

    # Subclasses should override this
    _moved_attributes = []


class MovedAttribute(_LazyDescr):

    def __init__(self, name, old_mod, new_mod, old_attr=None, new_attr=None):
        super(MovedAttribute, self).__init__(name)
        if PY3:
            if new_mod is None:
                new_mod = name
            self.mod = new_mod
            if new_attr is None:
                if old_attr is None:
                    new_attr = name
                else:
                    new_attr = old_attr
            self.attr = new_attr
        else:
            self.mod = old_mod
            if old_attr is None:
                old_attr = name
            self.attr = old_attr

    def _resolve(self):
        module = _import_module(self.mod)
        return getattr(module, self.attr)


class _SixMetaPathImporter(object):

    """
    A meta path importer to import six.moves and its submodules.

    This class implements a PEP302 finder and loader. It should be compatible
    with Python 2.5 and all existing versions of Python3
    """

    def __init__(self, six_module_name):
        self.name = six_module_name
        self.known_modules = {}

    def _add_module(self, mod, *fullnames):
        for fullname in fullnames:
            self.known_modules[self.name + "." + fullname] = mod

    def _get_module(self, fullname):
        return self.known_modules[self.name + "." + fullname]

    def find_module(self, fullname, path=None):
        if fullname in self.known_modules:
            return self
        return None

    def __get_module(self, fullname):
        try:
            return self.known_modules[fullname]
        except KeyError:
            raise ImportError("This loader does not know module " + fullname)

    def load_module(self, fullname):
        try:
            # in case of a reload
            return sys.modules[fullname]
        except KeyError:
            pass
        mod = self.__get_module(fullname)
        if isinstance(mod, MovedModule):
            mod = mod._resolve()
        else:
            mod.__loader__ = self
        sys.modules[fullname] = mod
        return mod

    def is_package(self, fullname):
        """
        Return true, if the named module is a package.

        We need this method to get correct spec objects with
        Python 3.4 (see PEP451)
        """
        return hasattr(self.__get_module(fullname), "__path__")

    def get_code(self, fullname):
        """Return None

        Required, if is_package is implemented"""
        self.__get_module(fullname)  # eventually raises ImportError
        return None
    get_source = get_code  # same as get_code

_importer = _SixMetaPathImporter(__name__)

class _MovedItems(_LazyModule):

    """Lazy loading of moved objects"""
    __path__ = []  # mark as package


_moved_attributes = [
    MovedAttribute("cStringIO", "cStringIO", "io", "StringIO"),
    MovedAttribute("filter", "itertools", "builtins", "ifilter", "filter"),
    MovedAttribute("filterfalse", "itertools", "itertools", "ifilterfalse", "filterfalse"),
    MovedAttribute("input", "__builtin__", "builtins", "raw_input", "input"),
    MovedAttribute("intern", "__builtin__", "sys"),
    MovedAttribute("map", "itertools", "builtins", "imap", "map"),
    MovedAttribute("getcwd", "os", "os", "getcwdu", "getcwd"),
    MovedAttribute("getcwdb", "os", "os", "getcwd", "getcwdb"),
    MovedAttribute("range", "__builtin__", "builtins", "xrange", "range"),
    MovedAttribute("reload_module", "__builtin__", "importlib" if PY34 else "imp", "reload"),
    MovedAttribute("reduce", "__builtin__", "functools"),
    MovedAttribute("shlex_quote", "pipes", "shlex", "quote"),
    MovedAttribute("StringIO", "StringIO", "io"),
    MovedAttribute("UserDict", "UserDict", "collections"),
    MovedAttribute("UserList", "UserList", "collections"),
    MovedAttribute("UserString", "UserString", "collections"),
    MovedAttribute("xrange", "__builtin__", "builtins", "xrange", "range"),
    MovedAttribute("zip", "itertools", "builtins", "izip", "zip"),
    MovedAttribute("zip_longest", "itertools", "itertools", "izip_longest", "zip_longest"),
    MovedModule("builtins", "__builtin__"),
    MovedModule("configparser", "ConfigParser"),
    MovedModule("copyreg", "copy_reg"),
    MovedModule("dbm_gnu", "gdbm", "dbm.gnu"),
    MovedModule("_dummy_thread", "dummy_thread", "_dummy_thread"),
    MovedModule("http_cookiejar", "cookielib", "http.cookiejar"),
    MovedModule("http_cookies", "Cookie", "http.cookies"),
    MovedModule("html_entities", "htmlentitydefs", "html.entities"),
    MovedModule("html_parser", "HTMLParser", "html.parser"),
    MovedModule("http_client", "httplib", "http.client"),
    MovedModule("email_mime_multipart", "email.MIMEMultipart", "email.mime.multipart"),
    MovedModule("email_mime_nonmultipart", "email.MIMENonMultipart", "email.mime.nonmultipart"),
    MovedModule("email_mime_text", "email.MIMEText", "email.mime.text"),
    MovedModule("email_mime_base", "email.MIMEBase", "email.mime.base"),
    MovedModule("BaseHTTPServer", "BaseHTTPServer", "http.server"),
    MovedModule("CGIHTTPServer", "CGIHTTPServer", "http.server"),
    MovedModule("SimpleHTTPServer", "SimpleHTTPServer", "http.server"),
    MovedModule("cPickle", "cPickle", "pickle"),
    MovedModule("queue", "Queue"),
    MovedModule("reprlib", "repr"),
    MovedModule("socketserver", "SocketServer"),
    MovedModule("_thread", "thread", "_thread"),
    MovedModule("tkinter", "Tkinter"),
    MovedModule("tkinter_dialog", "Dialog", "tkinter.dialog"),
    MovedModule("tkinter_filedialog", "FileDialog", "tkinter.filedialog"),
    MovedModule("tkinter_scrolledtext", "ScrolledText", "tkinter.scrolledtext"),
    MovedModule("tkinter_simpledialog", "SimpleDialog", "tkinter.simpledialog"),
    MovedModule("tkinter_tix", "Tix", "tkinter.tix"),
    MovedModule("tkinter_ttk", "ttk", "tkinter.ttk"),
    MovedModule("tkinter_constants", "Tkconstants", "tkinter.constants"),
    MovedModule("tkinter_dnd", "Tkdnd", "tkinter.dnd"),
    MovedModule("tkinter_colorchooser", "tkColorChooser",
                "tkinter.colorchooser"),
    MovedModule("tkinter_commondialog", "tkCommonDialog",
                "tkinter.commondialog"),
    MovedModule("tkinter_tkfiledialog", "tkFileDialog", "tkinter.filedialog"),
    MovedModule("tkinter_font", "tkFont", "tkinter.font"),
    MovedModule("tkinter_messagebox", "tkMessageBox", "tkinter.messagebox"),
    MovedModule("tkinter_tksimpledialog", "tkSimpleDialog",
                "tkinter.simpledialog"),
    MovedModule("urllib_parse", __name__ + ".moves.urllib_parse", "urllib.parse"),
    MovedModule("urllib_error", __name__ + ".moves.urllib_error", "urllib.error"),
    MovedModule("urllib", __name__ + ".moves.urllib", __name__ + ".moves.urllib"),
    MovedModule("urllib_robotparser", "robotparser", "urllib.robotparser"),
    MovedModule("xmlrpc_client", "xmlrpclib", "xmlrpc.client"),
    MovedModule("xmlrpc_server", "SimpleXMLRPCServer", "xmlrpc.server"),
]
# Add windows specific modules.
if sys.platform == "win32":
    _moved_attributes += [
        MovedModule("winreg", "_winreg"),
    ]

for attr in _moved_attributes:
    setattr(_MovedItems, attr.name, attr)
    if isinstance(attr, MovedModule):
        _importer._add_module(attr, "moves." + attr.name)
del attr

_MovedItems._moved_attributes = _moved_attributes

moves = _MovedItems(__name__ + ".moves")
_importer._add_module(moves, "moves")

class Module_six_moves_urllib_parse(_LazyModule):

    """Lazy loading of moved objects in six.moves.urllib_parse"""


_urllib_parse_moved_attributes = [
    MovedAttribute("ParseResult", "urlparse", "urllib.parse"),
    MovedAttribute("SplitResult", "urlparse", "urllib.parse"),
    MovedAttribute("parse_qs", "urlparse", "urllib.parse"),
    MovedAttribute("parse_qsl", "urlparse", "urllib.parse"),
    MovedAttribute("urldefrag", "urlparse", "urllib.parse"),
    MovedAttribute("urljoin", "urlparse", "urllib.parse"),
    MovedAttribute("urlparse", "urlparse", "urllib.parse"),
    MovedAttribute("urlsplit", "urlparse", "urllib.parse"),
    MovedAttribute("urlunparse", "urlparse", "urllib.parse"),
    MovedAttribute("urlunsplit", "urlparse", "urllib.parse"),
    MovedAttribute("quote", "urllib", "urllib.parse"),
    MovedAttribute("quote_plus", "urllib", "urllib.parse"),
    MovedAttribute("unquote", "urllib", "urllib.parse"),
    MovedAttribute("unquote_plus", "urllib", "urllib.parse"),
    MovedAttribute("urlencode", "urllib", "urllib.parse"),
    MovedAttribute("splitquery", "urllib", "urllib.parse"),
    MovedAttribute("splittag", "urllib", "urllib.parse"),
    MovedAttribute("splituser", "urllib", "urllib.parse"),
    MovedAttribute("uses_fragment", "urlparse", "urllib.parse"),
    MovedAttribute("uses_netloc", "urlparse", "urllib.parse"),
    MovedAttribute("uses_params", "urlparse", "urllib.parse"),
    MovedAttribute("uses_query", "urlparse", "urllib.parse"),
    MovedAttribute("uses_relative", "urlparse", "urllib.parse"),
]
for attr in _urllib_parse_moved_attributes:
    setattr(Module_six_moves_urllib_parse, attr.name, attr)
del attr

Module_six_moves_urllib_parse._moved_attributes = _urllib_parse_moved_attributes

_importer._add_module(Module_six_moves_urllib_parse(__name__ + ".moves.urllib_parse"),
                      "moves.urllib_parse", "moves.urllib.parse")


class Module_six_moves_urllib_error(_LazyModule):

    """Lazy loading of moved objects in six.moves.urllib_error"""


_urllib_error_moved_attributes = [
    MovedAttribute("URLError", "urllib2", "urllib.error"),
    MovedAttribute("HTTPError", "urllib2", "urllib.error"),
    MovedAttribute("ContentTooShortError", "urllib", "urllib.error"),
]
for attr in _urllib_error_moved_attributes:
    setattr(Module_six_moves_urllib_error, attr.name, attr)
del attr

Module_six_moves_urllib_error._moved_attributes = _urllib_error_moved_attributes

_importer._add_module(Module_six_moves_urllib_error(__name__ + ".moves.urllib.error"),
                      "moves.urllib_error", "moves.urllib.error")


class Module_six_moves_urllib_request(_LazyModule):

    """Lazy loading of moved objects in six.moves.urllib_request"""


_urllib_request_moved_attributes = [
    MovedAttribute("urlopen", "urllib2", "urllib.request"),
    MovedAttribute("install_opener", "urllib2", "urllib.request"),
    MovedAttribute("build_opener", "urllib2", "urllib.request"),
    MovedAttribute("pathname2url", "urllib", "urllib.request"),
    MovedAttribute("url2pathname", "urllib", "urllib.request"),
    MovedAttribute("getproxies", "urllib", "urllib.request"),
    MovedAttribute("Request", "urllib2", "urllib.request"),
    MovedAttribute("OpenerDirector", "urllib2", "urllib.request"),
    MovedAttribute("HTTPDefaultErrorHandler", "urllib2", "urllib.request"),
    MovedAttribute("HTTPRedirectHandler", "urllib2", "urllib.request"),
    MovedAttribute("HTTPCookieProcessor", "urllib2", "urllib.request"),
    MovedAttribute("ProxyHandler", "urllib2", "urllib.request"),
    MovedAttribute("BaseHandler", "urllib2", "urllib.request"),
    MovedAttribute("HTTPPasswordMgr", "urllib2", "urllib.request"),
    MovedAttribute("HTTPPasswordMgrWithDefaultRealm", "urllib2", "urllib.request"),
    MovedAttribute("AbstractBasicAuthHandler", "urllib2", "urllib.request"),
    MovedAttribute("HTTPBasicAuthHandler", "urllib2", "urllib.request"),
    MovedAttribute("ProxyBasicAuthHandler", "urllib2", "urllib.request"),
    MovedAttribute("AbstractDigestAuthHandler", "urllib2", "urllib.request"),
    MovedAttribute("HTTPDigestAuthHandler", "urllib2", "urllib.request"),
    MovedAttribute("ProxyDigestAuthHandler", "urllib2", "urllib.request"),
    MovedAttribute("HTTPHandler", "urllib2", "urllib.request"),
    MovedAttribute("HTTPSHandler", "urllib2", "urllib.request"),
    MovedAttribute("FileHandler", "urllib2", "urllib.request"),
    MovedAttribute("FTPHandler", "urllib2", "urllib.request"),
    MovedAttribute("CacheFTPHandler", "urllib2", "urllib.request"),
    MovedAttribute("UnknownHandler", "urllib2", "urllib.request"),
    MovedAttribute("HTTPErrorProcessor", "urllib2", "urllib.request"),
    MovedAttribute("urlretrieve", "urllib", "urllib.request"),
    MovedAttribute("urlcleanup", "urllib", "urllib.request"),
    MovedAttribute("URLopener", "urllib", "urllib.request"),
    MovedAttribute("FancyURLopener", "urllib", "urllib.request"),
    MovedAttribute("proxy_bypass", "urllib", "urllib.request"),
]
for attr in _urllib_request_moved_attributes:
    setattr(Module_six_moves_urllib_request, attr.name, attr)
del attr

Module_six_moves_urllib_request._moved_attributes = _urllib_request_moved_attributes

_importer._add_module(Module_six_moves_urllib_request(__name__ + ".moves.urllib.request"),
                      "moves.urllib_request", "moves.urllib.request")


class Module_six_moves_urllib_response(_LazyModule):

    """Lazy loading of moved objects in six.moves.urllib_response"""


_urllib_response_moved_attributes = [
    MovedAttribute("addbase", "urllib", "urllib.response"),
    MovedAttribute("addclosehook", "urllib", "urllib.response"),
    MovedAttribute("addinfo", "urllib", "urllib.response"),
    MovedAttribute("addinfourl", "urllib", "urllib.response"),
]
for attr in _urllib_response_moved_attributes:
    setattr(Module_six_moves_urllib_response, attr.name, attr)
del attr

Module_six_moves_urllib_response._moved_attributes = _urllib_response_moved_attributes

_importer._add_module(Module_six_moves_urllib_response(__name__ + ".moves.urllib.response"),
                      "moves.urllib_response", "moves.urllib.response")


class Module_six_moves_urllib_robotparser(_LazyModule):

    """Lazy loading of moved objects in six.moves.urllib_robotparser"""


_urllib_robotparser_moved_attributes = [
    MovedAttribute("RobotFileParser", "robotparser", "urllib.robotparser"),
]
for attr in _urllib_robotparser_moved_attributes:
    setattr(Module_six_moves_urllib_robotparser, attr.name, attr)
del attr

Module_six_moves_urllib_robotparser._moved_attributes = _urllib_robotparser_moved_attributes

_importer._add_module(Module_six_moves_urllib_robotparser(__name__ + ".moves.urllib.robotparser"),
                      "moves.urllib_robotparser", "moves.urllib.robotparser")


class Module_six_moves_urllib(types.ModuleType):

    """Create a six.moves.urllib namespace that resembles the Python 3 namespace"""
    __path__ = []  # mark as package
    parse = _importer._get_module("moves.urllib_parse")
    error = _importer._get_module("moves.urllib_error")
    request = _importer._get_module("moves.urllib_request")
    response = _importer._get_module("moves.urllib_response")
    robotparser = _importer._get_module("moves.urllib_robotparser")

    def __dir__(self):
        return ['parse', 'error', 'request', 'response', 'robotparser']

_importer._add_module(Module_six_moves_urllib(__name__ + ".moves.urllib"),
                      "moves.urllib")

def add_move(move):
    """Add an item to six.moves."""
    setattr(_MovedItems, move.name, move)


def remove_move(name):
    """Remove item from six.moves."""
    try:
        delattr(_MovedItems, name)
    except AttributeError:
        try:
            del moves.__dict__[name]
        except KeyError:
            raise AttributeError("no such move, %r" % (name,))


if PY3:
    _meth_func = "__func__"
    _meth_self = "__self__"

    _func_closure = "__closure__"
    _func_code = "__code__"
    _func_defaults = "__defaults__"
    _func_globals = "__globals__"
else:
    _meth_func = "im_func"
    _meth_self = "im_self"

    _func_closure = "func_closure"
    _func_code = "func_code"
    _func_defaults = "func_defaults"
    _func_globals = "func_globals"


try:
    advance_iterator = next
except NameError:
    def advance_iterator(it):
        return it.next()
next = advance_iterator


try:
    callable = callable
except NameError:
    def callable(obj):
        return any("__call__" in klass.__dict__ for klass in type(obj).__mro__)


if PY3:
    def get_unbound_function(unbound):
        return unbound

    create_bound_method = types.MethodType

    def create_unbound_method(func, cls):
        return func

    Iterator = object
else:
    def get_unbound_function(unbound):
        return unbound.im_func

    def create_bound_method(func, obj):
        return types.MethodType(func, obj, obj.__class__)

    def create_unbound_method(func, cls):
        return types.MethodType(func, None, cls)

    class Iterator(object):

        def next(self):
            return type(self).__next__(self)

    callable = callable
_add_doc(get_unbound_function,
         """Get the function out of a possibly unbound function""")


get_method_function = operator.attrgetter(_meth_func)
get_method_self = operator.attrgetter(_meth_self)
get_function_closure = operator.attrgetter(_func_closure)
get_function_code = operator.attrgetter(_func_code)
get_function_defaults = operator.attrgetter(_func_defaults)
get_function_globals = operator.attrgetter(_func_globals)


if PY3:
    def iterkeys(d, **kw):
        return iter(d.keys(**kw))

    def itervalues(d, **kw):
        return iter(d.values(**kw))

    def iteritems(d, **kw):
        return iter(d.items(**kw))

    def iterlists(d, **kw):
        return iter(d.lists(**kw))

    viewkeys = operator.methodcaller("keys")

    viewvalues = operator.methodcaller("values")

    viewitems = operator.methodcaller("items")
else:
    def iterkeys(d, **kw):
        return d.iterkeys(**kw)

    def itervalues(d, **kw):
        return d.itervalues(**kw)

    def iteritems(d, **kw):
        return d.iteritems(**kw)

    def iterlists(d, **kw):
        return d.iterlists(**kw)

    viewkeys = operator.methodcaller("viewkeys")

    viewvalues = operator.methodcaller("viewvalues")

    viewitems = operator.methodcaller("viewitems")

_add_doc(iterkeys, "Return an iterator over the keys of a dictionary.")
_add_doc(itervalues, "Return an iterator over the values of a dictionary.")
_add_doc(iteritems,
         "Return an iterator over the (key, value) pairs of a dictionary.")
_add_doc(iterlists,
         "Return an iterator over the (key, [values]) pairs of a dictionary.")


if PY3:
    def b(s):
        return s.encode("latin-1")

    def u(s):
        return s
    unichr = chr
    import struct
    int2byte = struct.Struct(">B").pack
    del struct
    byte2int = operator.itemgetter(0)
    indexbytes = operator.getitem
    iterbytes = iter
    import io
    StringIO = io.StringIO
    BytesIO = io.BytesIO
    _assertCountEqual = "assertCountEqual"
    if sys.version_info[1] <= 1:
        _assertRaisesRegex = "assertRaisesRegexp"
        _assertRegex = "assertRegexpMatches"
    else:
        _assertRaisesRegex = "assertRaisesRegex"
        _assertRegex = "assertRegex"
else:
    def b(s):
        return s
    # Workaround for standalone backslash

    def u(s):
        return unicode(s.replace(r'\\', r'\\\\'), "unicode_escape")
    unichr = unichr
    int2byte = chr

    def byte2int(bs):
        return ord(bs[0])

    def indexbytes(buf, i):
        return ord(buf[i])
    iterbytes = functools.partial(itertools.imap, ord)
    import StringIO
    StringIO = BytesIO = StringIO.StringIO
    _assertCountEqual = "assertItemsEqual"
    _assertRaisesRegex = "assertRaisesRegexp"
    _assertRegex = "assertRegexpMatches"
_add_doc(b, """Byte literal""")
_add_doc(u, """Text literal""")


def assertCountEqual(self, *args, **kwargs):
    return getattr(self, _assertCountEqual)(*args, **kwargs)


def assertRaisesRegex(self, *args, **kwargs):
    return getattr(self, _assertRaisesRegex)(*args, **kwargs)


def assertRegex(self, *args, **kwargs):
    return getattr(self, _assertRegex)(*args, **kwargs)


if PY3:
    exec_ = getattr(moves.builtins, "exec")

    def reraise(tp, value, tb=None):
        if value is None:
            value = tp()
        if value.__traceback__ is not tb:
            raise value.with_traceback(tb)
        raise value

else:
    def exec_(_code_, _globs_=None, _locs_=None):
        """Execute code in a namespace."""
        if _globs_ is None:
            frame = sys._getframe(1)
            _globs_ = frame.f_globals
            if _locs_ is None:
                _locs_ = frame.f_locals
            del frame
        elif _locs_ is None:
            _locs_ = _globs_
        exec("""exec _code_ in _globs_, _locs_""")

    exec_("""def reraise(tp, value, tb=None):
    raise tp, value, tb
""")


if sys.version_info[:2] == (3, 2):
    exec_("""def raise_from(value, from_value):
    if from_value is None:
        raise value
    raise value from from_value
""")
elif sys.version_info[:2] > (3, 2):
    exec_("""def raise_from(value, from_value):
    raise value from from_value
""")
else:
    def raise_from(value, from_value):
        raise value


print_ = getattr(moves.builtins, "print", None)
if print_ is None:
    def print_(*args, **kwargs):
        """The new-style print function for Python 2.4 and 2.5."""
        fp = kwargs.pop("file", sys.stdout)
        if fp is None:
            return

        def write(data):
            if not isinstance(data, basestring):
                data = str(data)
            # If the file has an encoding, encode unicode with it.
            if (isinstance(fp, file) and
                    isinstance(data, unicode) and
                    fp.encoding is not None):
                errors = getattr(fp, "errors", None)
                if errors is None:
                    errors = "strict"
                data = data.encode(fp.encoding, errors)
            fp.write(data)
        want_unicode = False
        sep = kwargs.pop("sep", None)
        if sep is not None:
            if isinstance(sep, unicode):
                want_unicode = True
            elif not isinstance(sep, str):
                raise TypeError("sep must be None or a string")
        end = kwargs.pop("end", None)
        if end is not None:
            if isinstance(end, unicode):
                want_unicode = True
            elif not isinstance(end, str):
                raise TypeError("end must be None or a string")
        if kwargs:
            raise TypeError("invalid keyword arguments to print()")
        if not want_unicode:
            for arg in args:
                if isinstance(arg, unicode):
                    want_unicode = True
                    break
        if want_unicode:
            newline = unicode("\n")
            space = unicode(" ")
        else:
            newline = "\n"
            space = " "
        if sep is None:
            sep = space
        if end is None:
            end = newline
        for i, arg in enumerate(args):
            if i:
                write(sep)
            write(arg)
        write(end)
if sys.version_info[:2] < (3, 3):
|
||||||
|
_print = print_
|
||||||
|
|
||||||
|
def print_(*args, **kwargs):
|
||||||
|
fp = kwargs.get("file", sys.stdout)
|
||||||
|
flush = kwargs.pop("flush", False)
|
||||||
|
_print(*args, **kwargs)
|
||||||
|
if flush and fp is not None:
|
||||||
|
fp.flush()
|
||||||
|
|
||||||
|
_add_doc(reraise, """Reraise an exception.""")
|
||||||
|
|
||||||
|
if sys.version_info[0:2] < (3, 4):
|
||||||
|
def wraps(wrapped, assigned=functools.WRAPPER_ASSIGNMENTS,
|
||||||
|
updated=functools.WRAPPER_UPDATES):
|
||||||
|
def wrapper(f):
|
||||||
|
f = functools.wraps(wrapped, assigned, updated)(f)
|
||||||
|
f.__wrapped__ = wrapped
|
||||||
|
return f
|
||||||
|
return wrapper
|
||||||
|
else:
|
||||||
|
wraps = functools.wraps
|
||||||
|
|
||||||
|
|
||||||
|
def with_metaclass(meta, *bases):
|
||||||
|
"""Create a base class with a metaclass."""
|
||||||
|
# This requires a bit of explanation: the basic idea is to make a dummy
|
||||||
|
# metaclass for one level of class instantiation that replaces itself with
|
||||||
|
# the actual metaclass.
|
||||||
|
class metaclass(meta):
|
||||||
|
|
||||||
|
def __new__(cls, name, this_bases, d):
|
||||||
|
return meta(name, bases, d)
|
||||||
|
return type.__new__(metaclass, 'temporary_class', (), {})
|
||||||
|
|
||||||
|
|
||||||
|
def add_metaclass(metaclass):
|
||||||
|
"""Class decorator for creating a class with a metaclass."""
|
||||||
|
def wrapper(cls):
|
||||||
|
orig_vars = cls.__dict__.copy()
|
||||||
|
slots = orig_vars.get('__slots__')
|
||||||
|
if slots is not None:
|
||||||
|
if isinstance(slots, str):
|
||||||
|
slots = [slots]
|
||||||
|
for slots_var in slots:
|
||||||
|
orig_vars.pop(slots_var)
|
||||||
|
orig_vars.pop('__dict__', None)
|
||||||
|
orig_vars.pop('__weakref__', None)
|
||||||
|
return metaclass(cls.__name__, cls.__bases__, orig_vars)
|
||||||
|
return wrapper
|
||||||
|
|
||||||
|
|
||||||
|
def python_2_unicode_compatible(klass):
|
||||||
|
"""
|
||||||
|
A decorator that defines __unicode__ and __str__ methods under Python 2.
|
||||||
|
Under Python 3 it does nothing.
|
||||||
|
|
||||||
|
To support Python 2 and 3 with a single code base, define a __str__ method
|
||||||
|
returning text and apply this decorator to the class.
|
||||||
|
"""
|
||||||
|
if PY2:
|
||||||
|
if '__str__' not in klass.__dict__:
|
||||||
|
raise ValueError("@python_2_unicode_compatible cannot be applied "
|
||||||
|
"to %s because it doesn't define __str__()." %
|
||||||
|
klass.__name__)
|
||||||
|
klass.__unicode__ = klass.__str__
|
||||||
|
klass.__str__ = lambda self: self.__unicode__().encode('utf-8')
|
||||||
|
return klass
|
||||||
|
|
||||||
|
|
||||||
|
# Complete the moves implementation.
|
||||||
|
# This code is at the end of this module to speed up module loading.
|
||||||
|
# Turn this module into a package.
|
||||||
|
__path__ = [] # required for PEP 302 and PEP 451
|
||||||
|
__package__ = __name__ # see PEP 366 @ReservedAssignment
|
||||||
|
if globals().get("__spec__") is not None:
|
||||||
|
__spec__.submodule_search_locations = [] # PEP 451 @UndefinedVariable
|
||||||
|
# Remove other six meta path importers, since they cause problems. This can
|
||||||
|
# happen if six is removed from sys.modules and then reloaded. (Setuptools does
|
||||||
|
# this for some reason.)
|
||||||
|
if sys.meta_path:
|
||||||
|
for i, importer in enumerate(sys.meta_path):
|
||||||
|
# Here's some real nastiness: Another "instance" of the six module might
|
||||||
|
# be floating around. Therefore, we can't use isinstance() to check for
|
||||||
|
# the six meta path importer, since the other six instance will have
|
||||||
|
# inserted an importer with different class.
|
||||||
|
if (type(importer).__name__ == "_SixMetaPathImporter" and
|
||||||
|
importer.name == __name__):
|
||||||
|
del sys.meta_path[i]
|
||||||
|
break
|
||||||
|
del i, importer
|
||||||
|
# Finally, add the importer to the meta path import hook.
|
||||||
|
sys.meta_path.append(_importer)
|
||||||
@@ -0,0 +1,173 @@
"""Utilities for extracting common archive formats"""

import zipfile
import tarfile
import os
import shutil
import posixpath
import contextlib
from distutils.errors import DistutilsError

from pkg_resources import ensure_directory

__all__ = [
    "unpack_archive", "unpack_zipfile", "unpack_tarfile", "default_filter",
    "UnrecognizedFormat", "extraction_drivers", "unpack_directory",
]


class UnrecognizedFormat(DistutilsError):
    """Couldn't recognize the archive type"""


def default_filter(src, dst):
    """The default progress/filter callback; returns True for all files"""
    return dst


def unpack_archive(filename, extract_dir, progress_filter=default_filter,
                   drivers=None):
    """Unpack `filename` to `extract_dir`, or raise ``UnrecognizedFormat``

    `progress_filter` is a function taking two arguments: a source path
    internal to the archive ('/'-separated), and a filesystem path where it
    will be extracted. The callback must return the desired extract path
    (which may be the same as the one passed in), or else ``None`` to skip
    that file or directory. The callback can thus be used to report on the
    progress of the extraction, as well as to filter the items extracted or
    alter their extraction paths.

    `drivers`, if supplied, must be a non-empty sequence of functions with the
    same signature as this function (minus the `drivers` argument), that raise
    ``UnrecognizedFormat`` if they do not support extracting the designated
    archive type. The `drivers` are tried in sequence until one is found that
    does not raise an error, or until all are exhausted (in which case
    ``UnrecognizedFormat`` is raised). If you do not supply a sequence of
    drivers, the module's ``extraction_drivers`` constant will be used, which
    means that ``unpack_zipfile`` and ``unpack_tarfile`` will be tried, in that
    order.
    """
    for driver in drivers or extraction_drivers:
        try:
            driver(filename, extract_dir, progress_filter)
        except UnrecognizedFormat:
            continue
        else:
            return
    else:
        raise UnrecognizedFormat(
            "Not a recognized archive type: %s" % filename
        )


def unpack_directory(filename, extract_dir, progress_filter=default_filter):
    """"Unpack" a directory, using the same interface as for archives

    Raises ``UnrecognizedFormat`` if `filename` is not a directory
    """
    if not os.path.isdir(filename):
        raise UnrecognizedFormat("%s is not a directory" % filename)

    paths = {
        filename: ('', extract_dir),
    }
    for base, dirs, files in os.walk(filename):
        src, dst = paths[base]
        for d in dirs:
            paths[os.path.join(base, d)] = src + d + '/', os.path.join(dst, d)
        for f in files:
            target = os.path.join(dst, f)
            target = progress_filter(src + f, target)
            if not target:
                # skip non-files
                continue
            ensure_directory(target)
            f = os.path.join(base, f)
            shutil.copyfile(f, target)
            shutil.copystat(f, target)


def unpack_zipfile(filename, extract_dir, progress_filter=default_filter):
    """Unpack zip `filename` to `extract_dir`

    Raises ``UnrecognizedFormat`` if `filename` is not a zipfile (as determined
    by ``zipfile.is_zipfile()``). See ``unpack_archive()`` for an explanation
    of the `progress_filter` argument.
    """

    if not zipfile.is_zipfile(filename):
        raise UnrecognizedFormat("%s is not a zip file" % (filename,))

    with zipfile.ZipFile(filename) as z:
        for info in z.infolist():
            name = info.filename

            # don't extract absolute paths or ones with .. in them
            if name.startswith('/') or '..' in name.split('/'):
                continue

            target = os.path.join(extract_dir, *name.split('/'))
            target = progress_filter(name, target)
            if not target:
                continue
            if name.endswith('/'):
                # directory
                ensure_directory(target)
            else:
                # file
                ensure_directory(target)
                data = z.read(info.filename)
                with open(target, 'wb') as f:
                    f.write(data)
            unix_attributes = info.external_attr >> 16
            if unix_attributes:
                os.chmod(target, unix_attributes)


def unpack_tarfile(filename, extract_dir, progress_filter=default_filter):
    """Unpack tar/tar.gz/tar.bz2 `filename` to `extract_dir`

    Raises ``UnrecognizedFormat`` if `filename` is not a tarfile (as determined
    by ``tarfile.open()``). See ``unpack_archive()`` for an explanation
    of the `progress_filter` argument.
    """
    try:
        tarobj = tarfile.open(filename)
    except tarfile.TarError:
        raise UnrecognizedFormat(
            "%s is not a compressed or uncompressed tar file" % (filename,)
        )
    with contextlib.closing(tarobj):
        # don't do any chowning!
        tarobj.chown = lambda *args: None
        for member in tarobj:
            name = member.name
            # don't extract absolute paths or ones with .. in them
            if not name.startswith('/') and '..' not in name.split('/'):
                prelim_dst = os.path.join(extract_dir, *name.split('/'))

                # resolve any links and to extract the link targets as normal
                # files
                while member is not None and (member.islnk() or member.issym()):
                    linkpath = member.linkname
                    if member.issym():
                        base = posixpath.dirname(member.name)
                        linkpath = posixpath.join(base, linkpath)
                        linkpath = posixpath.normpath(linkpath)
                    member = tarobj._getmember(linkpath)

                if member is not None and (member.isfile() or member.isdir()):
                    final_dst = progress_filter(name, prelim_dst)
                    if final_dst:
                        if final_dst.endswith(os.sep):
                            final_dst = final_dst[:-1]
                        try:
                            # XXX Ugh
                            tarobj._extract_member(member, final_dst)
                        except tarfile.ExtractError:
                            # chown/chmod/mkfifo/mknode/makedev failed
                            pass
    return True


extraction_drivers = unpack_directory, unpack_zipfile, unpack_tarfile
@@ -0,0 +1,172 @@
"""A PEP 517 interface to setuptools

Previously, when a user or a command line tool (let's call it a "frontend")
needed to make a request of setuptools to take a certain action, for
example, generating a list of installation requirements, the frontend
would call "setup.py egg_info" or "setup.py bdist_wheel" on the command line.

PEP 517 defines a different method of interfacing with setuptools. Rather
than calling "setup.py" directly, the frontend should:

  1. Set the current directory to the directory with a setup.py file
  2. Import this module into a safe python interpreter (one in which
     setuptools can potentially set global variables or crash hard).
  3. Call one of the functions defined in PEP 517.

What each function does is defined in PEP 517. However, here is a "casual"
definition of the functions (this definition should not be relied on for
bug reports or API stability):

  - `build_wheel`: build a wheel in the folder and return the basename
  - `get_requires_for_build_wheel`: get the `setup_requires` to build
  - `prepare_metadata_for_build_wheel`: get the `install_requires`
  - `build_sdist`: build an sdist in the folder and return the basename
  - `get_requires_for_build_sdist`: get the `setup_requires` to build

Again, this is not a formal definition! Just a "taste" of the module.
"""

import os
import sys
import tokenize
import shutil
import contextlib

import setuptools
import distutils


class SetupRequirementsError(BaseException):
    def __init__(self, specifiers):
        self.specifiers = specifiers


class Distribution(setuptools.dist.Distribution):
    def fetch_build_eggs(self, specifiers):
        raise SetupRequirementsError(specifiers)

    @classmethod
    @contextlib.contextmanager
    def patch(cls):
        """
        Replace
        distutils.dist.Distribution with this class
        for the duration of this context.
        """
        orig = distutils.core.Distribution
        distutils.core.Distribution = cls
        try:
            yield
        finally:
            distutils.core.Distribution = orig


def _run_setup(setup_script='setup.py'):
    # Note that we can reuse our build directory between calls
    # Correctness comes first, then optimization later
    __file__ = setup_script
    __name__ = '__main__'
    f = getattr(tokenize, 'open', open)(__file__)
    code = f.read().replace('\\r\\n', '\\n')
    f.close()
    exec(compile(code, __file__, 'exec'), locals())


def _fix_config(config_settings):
    config_settings = config_settings or {}
    config_settings.setdefault('--global-option', [])
    return config_settings


def _get_build_requires(config_settings):
    config_settings = _fix_config(config_settings)
    requirements = ['setuptools', 'wheel']

    sys.argv = sys.argv[:1] + ['egg_info'] + \
        config_settings["--global-option"]
    try:
        with Distribution.patch():
            _run_setup()
    except SetupRequirementsError as e:
        requirements += e.specifiers

    return requirements


def _get_immediate_subdirectories(a_dir):
    return [name for name in os.listdir(a_dir)
            if os.path.isdir(os.path.join(a_dir, name))]


def get_requires_for_build_wheel(config_settings=None):
    config_settings = _fix_config(config_settings)
    return _get_build_requires(config_settings)


def get_requires_for_build_sdist(config_settings=None):
    config_settings = _fix_config(config_settings)
    return _get_build_requires(config_settings)


def prepare_metadata_for_build_wheel(metadata_directory, config_settings=None):
    sys.argv = sys.argv[:1] + ['dist_info', '--egg-base', metadata_directory]
    _run_setup()

    dist_info_directory = metadata_directory
    while True:
        dist_infos = [f for f in os.listdir(dist_info_directory)
                      if f.endswith('.dist-info')]

        if len(dist_infos) == 0 and \
                len(_get_immediate_subdirectories(dist_info_directory)) == 1:
            dist_info_directory = os.path.join(
                dist_info_directory, os.listdir(dist_info_directory)[0])
            continue

        assert len(dist_infos) == 1
        break

    # PEP 517 requires that the .dist-info directory be placed in the
    # metadata_directory. To comply, we MUST copy the directory to the root
    if dist_info_directory != metadata_directory:
        shutil.move(
            os.path.join(dist_info_directory, dist_infos[0]),
            metadata_directory)
        shutil.rmtree(dist_info_directory, ignore_errors=True)

    return dist_infos[0]


def build_wheel(wheel_directory, config_settings=None,
                metadata_directory=None):
    config_settings = _fix_config(config_settings)
    wheel_directory = os.path.abspath(wheel_directory)
    sys.argv = sys.argv[:1] + ['bdist_wheel'] + \
        config_settings["--global-option"]
    _run_setup()
    if wheel_directory != 'dist':
        shutil.rmtree(wheel_directory)
        shutil.copytree('dist', wheel_directory)

    wheels = [f for f in os.listdir(wheel_directory)
              if f.endswith('.whl')]

    assert len(wheels) == 1
    return wheels[0]


def build_sdist(sdist_directory, config_settings=None):
    config_settings = _fix_config(config_settings)
    sdist_directory = os.path.abspath(sdist_directory)
    sys.argv = sys.argv[:1] + ['sdist'] + \
        config_settings["--global-option"]
    _run_setup()
    if sdist_directory != 'dist':
        shutil.rmtree(sdist_directory)
        shutil.copytree('dist', sdist_directory)

    sdists = [f for f in os.listdir(sdist_directory)
              if f.endswith('.tar.gz')]

    assert len(sdists) == 1
    return sdists[0]
Binary file not shown.
Binary file not shown.
Binary file not shown.
@@ -0,0 +1,18 @@
__all__ = [
    'alias', 'bdist_egg', 'bdist_rpm', 'build_ext', 'build_py', 'develop',
    'easy_install', 'egg_info', 'install', 'install_lib', 'rotate', 'saveopts',
    'sdist', 'setopt', 'test', 'install_egg_info', 'install_scripts',
    'register', 'bdist_wininst', 'upload_docs', 'upload', 'build_clib',
    'dist_info',
]

from distutils.command.bdist import bdist
import sys

from setuptools.command import install_scripts

if 'egg' not in bdist.format_commands:
    bdist.format_command['egg'] = ('bdist_egg', "Python .egg file")
    bdist.format_commands.append('egg')

del bdist, sys
@@ -0,0 +1,80 @@
from distutils.errors import DistutilsOptionError

from setuptools.extern.six.moves import map

from setuptools.command.setopt import edit_config, option_base, config_file


def shquote(arg):
    """Quote an argument for later parsing by shlex.split()"""
    for c in '"', "'", "\\", "#":
        if c in arg:
            return repr(arg)
    if arg.split() != [arg]:
        return repr(arg)
    return arg


class alias(option_base):
    """Define a shortcut that invokes one or more commands"""

    description = "define a shortcut to invoke one or more commands"
    command_consumes_arguments = True

    user_options = [
        ('remove', 'r', 'remove (unset) the alias'),
    ] + option_base.user_options

    boolean_options = option_base.boolean_options + ['remove']

    def initialize_options(self):
        option_base.initialize_options(self)
        self.args = None
        self.remove = None

    def finalize_options(self):
        option_base.finalize_options(self)
        if self.remove and len(self.args) != 1:
            raise DistutilsOptionError(
                "Must specify exactly one argument (the alias name) when "
                "using --remove"
            )

    def run(self):
        aliases = self.distribution.get_option_dict('aliases')

        if not self.args:
            print("Command Aliases")
            print("---------------")
            for alias in aliases:
                print("setup.py alias", format_alias(alias, aliases))
            return

        elif len(self.args) == 1:
            alias, = self.args
            if self.remove:
                command = None
            elif alias in aliases:
                print("setup.py alias", format_alias(alias, aliases))
                return
            else:
                print("No alias definition found for %r" % alias)
                return
        else:
            alias = self.args[0]
            command = ' '.join(map(shquote, self.args[1:]))

        edit_config(self.filename, {'aliases': {alias: command}}, self.dry_run)


def format_alias(name, aliases):
    source, command = aliases[name]
    if source == config_file('global'):
        source = '--global-config '
    elif source == config_file('user'):
        source = '--user-config '
    elif source == config_file('local'):
        source = ''
    else:
        source = '--filename=%r' % source
    return source + name + ' ' + command
@@ -0,0 +1,502 @@
"""setuptools.command.bdist_egg

Build .egg distributions"""

from distutils.errors import DistutilsSetupError
from distutils.dir_util import remove_tree, mkpath
from distutils import log
from types import CodeType
import sys
import os
import re
import textwrap
import marshal

from setuptools.extern import six

from pkg_resources import get_build_platform, Distribution, ensure_directory
from pkg_resources import EntryPoint
from setuptools.extension import Library
from setuptools import Command

try:
    # Python 2.7 or >=3.2
    from sysconfig import get_path, get_python_version

    def _get_purelib():
        return get_path("purelib")
except ImportError:
    from distutils.sysconfig import get_python_lib, get_python_version

    def _get_purelib():
        return get_python_lib(False)


def strip_module(filename):
    if '.' in filename:
        filename = os.path.splitext(filename)[0]
    if filename.endswith('module'):
        filename = filename[:-6]
    return filename


def sorted_walk(dir):
    """Do os.walk in a reproducible way,
    independent of indeterministic filesystem readdir order
    """
    for base, dirs, files in os.walk(dir):
        dirs.sort()
        files.sort()
        yield base, dirs, files


def write_stub(resource, pyfile):
    _stub_template = textwrap.dedent("""
        def __bootstrap__():
            global __bootstrap__, __loader__, __file__
            import sys, pkg_resources, imp
            __file__ = pkg_resources.resource_filename(__name__, %r)
            __loader__ = None; del __bootstrap__, __loader__
            imp.load_dynamic(__name__,__file__)
        __bootstrap__()
        """).lstrip()
    with open(pyfile, 'w') as f:
        f.write(_stub_template % resource)


class bdist_egg(Command):
    description = "create an \"egg\" distribution"

    user_options = [
        ('bdist-dir=', 'b',
         "temporary directory for creating the distribution"),
        ('plat-name=', 'p', "platform name to embed in generated filenames "
                            "(default: %s)" % get_build_platform()),
        ('exclude-source-files', None,
         "remove all .py files from the generated egg"),
        ('keep-temp', 'k',
         "keep the pseudo-installation tree around after " +
         "creating the distribution archive"),
        ('dist-dir=', 'd',
         "directory to put final built distributions in"),
        ('skip-build', None,
         "skip rebuilding everything (for testing/debugging)"),
    ]

    boolean_options = [
        'keep-temp', 'skip-build', 'exclude-source-files'
    ]

    def initialize_options(self):
        self.bdist_dir = None
        self.plat_name = None
        self.keep_temp = 0
        self.dist_dir = None
        self.skip_build = 0
        self.egg_output = None
        self.exclude_source_files = None

    def finalize_options(self):
        ei_cmd = self.ei_cmd = self.get_finalized_command("egg_info")
        self.egg_info = ei_cmd.egg_info

        if self.bdist_dir is None:
            bdist_base = self.get_finalized_command('bdist').bdist_base
            self.bdist_dir = os.path.join(bdist_base, 'egg')

        if self.plat_name is None:
            self.plat_name = get_build_platform()

        self.set_undefined_options('bdist', ('dist_dir', 'dist_dir'))

        if self.egg_output is None:

            # Compute filename of the output egg
            basename = Distribution(
                None, None, ei_cmd.egg_name, ei_cmd.egg_version,
                get_python_version(),
                self.distribution.has_ext_modules() and self.plat_name
            ).egg_name()

            self.egg_output = os.path.join(self.dist_dir, basename + '.egg')

    def do_install_data(self):
        # Hack for packages that install data to install's --install-lib
        self.get_finalized_command('install').install_lib = self.bdist_dir

        site_packages = os.path.normcase(os.path.realpath(_get_purelib()))
        old, self.distribution.data_files = self.distribution.data_files, []

        for item in old:
            if isinstance(item, tuple) and len(item) == 2:
                if os.path.isabs(item[0]):
                    realpath = os.path.realpath(item[0])
                    normalized = os.path.normcase(realpath)
                    if normalized == site_packages or normalized.startswith(
                            site_packages + os.sep
                    ):
                        item = realpath[len(site_packages) + 1:], item[1]
                    # XXX else: raise ???
            self.distribution.data_files.append(item)

        try:
            log.info("installing package data to %s", self.bdist_dir)
            self.call_command('install_data', force=0, root=None)
        finally:
            self.distribution.data_files = old

    def get_outputs(self):
        return [self.egg_output]

    def call_command(self, cmdname, **kw):
        """Invoke reinitialized command `cmdname` with keyword args"""
        for dirname in INSTALL_DIRECTORY_ATTRS:
            kw.setdefault(dirname, self.bdist_dir)
        kw.setdefault('skip_build', self.skip_build)
        kw.setdefault('dry_run', self.dry_run)
        cmd = self.reinitialize_command(cmdname, **kw)
        self.run_command(cmdname)
        return cmd

    def run(self):
        # Generate metadata first
        self.run_command("egg_info")
        # We run install_lib before install_data, because some data hacks
        # pull their data path from the install_lib command.
        log.info("installing library code to %s", self.bdist_dir)
        instcmd = self.get_finalized_command('install')
        old_root = instcmd.root
        instcmd.root = None
        if self.distribution.has_c_libraries() and not self.skip_build:
            self.run_command('build_clib')
        cmd = self.call_command('install_lib', warn_dir=0)
        instcmd.root = old_root

        all_outputs, ext_outputs = self.get_ext_outputs()
        self.stubs = []
        to_compile = []
        for (p, ext_name) in enumerate(ext_outputs):
            filename, ext = os.path.splitext(ext_name)
            pyfile = os.path.join(self.bdist_dir, strip_module(filename) +
                                  '.py')
            self.stubs.append(pyfile)
            log.info("creating stub loader for %s", ext_name)
            if not self.dry_run:
                write_stub(os.path.basename(ext_name), pyfile)
            to_compile.append(pyfile)
            ext_outputs[p] = ext_name.replace(os.sep, '/')

        if to_compile:
            cmd.byte_compile(to_compile)
        if self.distribution.data_files:
            self.do_install_data()

        # Make the EGG-INFO directory
        archive_root = self.bdist_dir
        egg_info = os.path.join(archive_root, 'EGG-INFO')
        self.mkpath(egg_info)
        if self.distribution.scripts:
            script_dir = os.path.join(egg_info, 'scripts')
            log.info("installing scripts to %s", script_dir)
            self.call_command('install_scripts', install_dir=script_dir,
                              no_ep=1)

        self.copy_metadata_to(egg_info)
        native_libs = os.path.join(egg_info, "native_libs.txt")
        if all_outputs:
            log.info("writing %s", native_libs)
            if not self.dry_run:
                ensure_directory(native_libs)
                libs_file = open(native_libs, 'wt')
                libs_file.write('\n'.join(all_outputs))
                libs_file.write('\n')
                libs_file.close()
        elif os.path.isfile(native_libs):
            log.info("removing %s", native_libs)
            if not self.dry_run:
                os.unlink(native_libs)

        write_safety_flag(
            os.path.join(archive_root, 'EGG-INFO'), self.zip_safe()
        )

        if os.path.exists(os.path.join(self.egg_info, 'depends.txt')):
            log.warn(
                "WARNING: 'depends.txt' will not be used by setuptools 0.6!\n"
                "Use the install_requires/extras_require setup() args instead."
            )

        if self.exclude_source_files:
            self.zap_pyfiles()

        # Make the archive
        make_zipfile(self.egg_output, archive_root, verbose=self.verbose,
                     dry_run=self.dry_run, mode=self.gen_header())
        if not self.keep_temp:
            remove_tree(self.bdist_dir, dry_run=self.dry_run)

        # Add to 'Distribution.dist_files' so that the "upload" command works
        getattr(self.distribution, 'dist_files', []).append(
            ('bdist_egg', get_python_version(), self.egg_output))

    def zap_pyfiles(self):
        log.info("Removing .py files from temporary directory")
        for base, dirs, files in walk_egg(self.bdist_dir):
            for name in files:
                path = os.path.join(base, name)

                if name.endswith('.py'):
                    log.debug("Deleting %s", path)
                    os.unlink(path)

                if base.endswith('__pycache__'):
                    path_old = path

                    pattern = r'(?P<name>.+)\.(?P<magic>[^.]+)\.pyc'
                    m = re.match(pattern, name)
                    path_new = os.path.join(
                        base, os.pardir, m.group('name') + '.pyc')
                    log.info(
                        "Renaming file from [%s] to [%s]"
                        % (path_old, path_new))
                    try:
|
os.remove(path_new)
|
||||||
|
except OSError:
|
||||||
|
pass
|
||||||
|
os.rename(path_old, path_new)
|
||||||
|
|
||||||
|
def zip_safe(self):
|
||||||
|
safe = getattr(self.distribution, 'zip_safe', None)
|
||||||
|
if safe is not None:
|
||||||
|
return safe
|
||||||
|
log.warn("zip_safe flag not set; analyzing archive contents...")
|
||||||
|
return analyze_egg(self.bdist_dir, self.stubs)
|
||||||
|
|
||||||
|
def gen_header(self):
|
||||||
|
epm = EntryPoint.parse_map(self.distribution.entry_points or '')
|
||||||
|
ep = epm.get('setuptools.installation', {}).get('eggsecutable')
|
||||||
|
if ep is None:
|
||||||
|
return 'w' # not an eggsecutable, do it the usual way.
|
||||||
|
|
||||||
|
if not ep.attrs or ep.extras:
|
||||||
|
raise DistutilsSetupError(
|
||||||
|
"eggsecutable entry point (%r) cannot have 'extras' "
|
||||||
|
"or refer to a module" % (ep,)
|
||||||
|
)
|
||||||
|
|
||||||
|
pyver = sys.version[:3]
|
||||||
|
pkg = ep.module_name
|
||||||
|
full = '.'.join(ep.attrs)
|
||||||
|
base = ep.attrs[0]
|
||||||
|
basename = os.path.basename(self.egg_output)
|
||||||
|
|
||||||
|
header = (
|
||||||
|
"#!/bin/sh\n"
|
||||||
|
'if [ `basename $0` = "%(basename)s" ]\n'
|
||||||
|
'then exec python%(pyver)s -c "'
|
||||||
|
"import sys, os; sys.path.insert(0, os.path.abspath('$0')); "
|
||||||
|
"from %(pkg)s import %(base)s; sys.exit(%(full)s())"
|
||||||
|
'" "$@"\n'
|
||||||
|
'else\n'
|
||||||
|
' echo $0 is not the correct name for this egg file.\n'
|
||||||
|
' echo Please rename it back to %(basename)s and try again.\n'
|
||||||
|
' exec false\n'
|
||||||
|
'fi\n'
|
||||||
|
) % locals()
|
||||||
|
|
||||||
|
if not self.dry_run:
|
||||||
|
mkpath(os.path.dirname(self.egg_output), dry_run=self.dry_run)
|
||||||
|
f = open(self.egg_output, 'w')
|
||||||
|
f.write(header)
|
||||||
|
f.close()
|
||||||
|
return 'a'
|
||||||
|
|
||||||
|
def copy_metadata_to(self, target_dir):
|
||||||
|
"Copy metadata (egg info) to the target_dir"
|
||||||
|
# normalize the path (so that a forward-slash in egg_info will
|
||||||
|
# match using startswith below)
|
||||||
|
norm_egg_info = os.path.normpath(self.egg_info)
|
||||||
|
prefix = os.path.join(norm_egg_info, '')
|
||||||
|
for path in self.ei_cmd.filelist.files:
|
||||||
|
if path.startswith(prefix):
|
||||||
|
target = os.path.join(target_dir, path[len(prefix):])
|
||||||
|
ensure_directory(target)
|
||||||
|
self.copy_file(path, target)
|
||||||
|
|
||||||
|
def get_ext_outputs(self):
|
||||||
|
"""Get a list of relative paths to C extensions in the output distro"""
|
||||||
|
|
||||||
|
all_outputs = []
|
||||||
|
ext_outputs = []
|
||||||
|
|
||||||
|
paths = {self.bdist_dir: ''}
|
||||||
|
for base, dirs, files in sorted_walk(self.bdist_dir):
|
||||||
|
for filename in files:
|
||||||
|
if os.path.splitext(filename)[1].lower() in NATIVE_EXTENSIONS:
|
||||||
|
all_outputs.append(paths[base] + filename)
|
||||||
|
for filename in dirs:
|
||||||
|
paths[os.path.join(base, filename)] = (paths[base] +
|
||||||
|
filename + '/')
|
||||||
|
|
||||||
|
if self.distribution.has_ext_modules():
|
||||||
|
build_cmd = self.get_finalized_command('build_ext')
|
||||||
|
for ext in build_cmd.extensions:
|
||||||
|
if isinstance(ext, Library):
|
||||||
|
continue
|
||||||
|
fullname = build_cmd.get_ext_fullname(ext.name)
|
||||||
|
filename = build_cmd.get_ext_filename(fullname)
|
||||||
|
if not os.path.basename(filename).startswith('dl-'):
|
||||||
|
if os.path.exists(os.path.join(self.bdist_dir, filename)):
|
||||||
|
ext_outputs.append(filename)
|
||||||
|
|
||||||
|
return all_outputs, ext_outputs
|
||||||
|
|
||||||
|
|
||||||
|
NATIVE_EXTENSIONS = dict.fromkeys('.dll .so .dylib .pyd'.split())
|
||||||
|
|
||||||
|
|
||||||
|
def walk_egg(egg_dir):
|
||||||
|
"""Walk an unpacked egg's contents, skipping the metadata directory"""
|
||||||
|
walker = sorted_walk(egg_dir)
|
||||||
|
base, dirs, files = next(walker)
|
||||||
|
if 'EGG-INFO' in dirs:
|
||||||
|
dirs.remove('EGG-INFO')
|
||||||
|
yield base, dirs, files
|
||||||
|
for bdf in walker:
|
||||||
|
yield bdf
|
||||||
|
|
||||||
|
|
||||||
|
def analyze_egg(egg_dir, stubs):
|
||||||
|
# check for existing flag in EGG-INFO
|
||||||
|
for flag, fn in safety_flags.items():
|
||||||
|
if os.path.exists(os.path.join(egg_dir, 'EGG-INFO', fn)):
|
||||||
|
return flag
|
||||||
|
if not can_scan():
|
||||||
|
return False
|
||||||
|
safe = True
|
||||||
|
for base, dirs, files in walk_egg(egg_dir):
|
||||||
|
for name in files:
|
||||||
|
if name.endswith('.py') or name.endswith('.pyw'):
|
||||||
|
continue
|
||||||
|
elif name.endswith('.pyc') or name.endswith('.pyo'):
|
||||||
|
# always scan, even if we already know we're not safe
|
||||||
|
safe = scan_module(egg_dir, base, name, stubs) and safe
|
||||||
|
return safe
|
||||||
|
|
||||||
|
|
||||||
|
def write_safety_flag(egg_dir, safe):
|
||||||
|
# Write or remove zip safety flag file(s)
|
||||||
|
for flag, fn in safety_flags.items():
|
||||||
|
fn = os.path.join(egg_dir, fn)
|
||||||
|
if os.path.exists(fn):
|
||||||
|
if safe is None or bool(safe) != flag:
|
||||||
|
os.unlink(fn)
|
||||||
|
elif safe is not None and bool(safe) == flag:
|
||||||
|
f = open(fn, 'wt')
|
||||||
|
f.write('\n')
|
||||||
|
f.close()
|
||||||
|
|
||||||
|
|
||||||
|
safety_flags = {
|
||||||
|
True: 'zip-safe',
|
||||||
|
False: 'not-zip-safe',
|
||||||
|
}
|
||||||
|
|
||||||
|
|
||||||
|
def scan_module(egg_dir, base, name, stubs):
|
||||||
|
"""Check whether module possibly uses unsafe-for-zipfile stuff"""
|
||||||
|
|
||||||
|
filename = os.path.join(base, name)
|
||||||
|
if filename[:-1] in stubs:
|
||||||
|
return True # Extension module
|
||||||
|
pkg = base[len(egg_dir) + 1:].replace(os.sep, '.')
|
||||||
|
module = pkg + (pkg and '.' or '') + os.path.splitext(name)[0]
|
||||||
|
if sys.version_info < (3, 3):
|
||||||
|
skip = 8 # skip magic & date
|
||||||
|
elif sys.version_info < (3, 7):
|
||||||
|
skip = 12 # skip magic & date & file size
|
||||||
|
else:
|
||||||
|
skip = 16 # skip magic & reserved? & date & file size
|
||||||
|
f = open(filename, 'rb')
|
||||||
|
f.read(skip)
|
||||||
|
code = marshal.load(f)
|
||||||
|
f.close()
|
||||||
|
safe = True
|
||||||
|
symbols = dict.fromkeys(iter_symbols(code))
|
||||||
|
for bad in ['__file__', '__path__']:
|
||||||
|
if bad in symbols:
|
||||||
|
log.warn("%s: module references %s", module, bad)
|
||||||
|
safe = False
|
||||||
|
if 'inspect' in symbols:
|
||||||
|
for bad in [
|
||||||
|
'getsource', 'getabsfile', 'getsourcefile', 'getfile'
|
||||||
|
'getsourcelines', 'findsource', 'getcomments', 'getframeinfo',
|
||||||
|
'getinnerframes', 'getouterframes', 'stack', 'trace'
|
||||||
|
]:
|
||||||
|
if bad in symbols:
|
||||||
|
log.warn("%s: module MAY be using inspect.%s", module, bad)
|
||||||
|
safe = False
|
||||||
|
return safe
|
||||||
|
|
||||||
|
|
||||||
|
def iter_symbols(code):
|
||||||
|
"""Yield names and strings used by `code` and its nested code objects"""
|
||||||
|
for name in code.co_names:
|
||||||
|
yield name
|
||||||
|
for const in code.co_consts:
|
||||||
|
if isinstance(const, six.string_types):
|
||||||
|
yield const
|
||||||
|
elif isinstance(const, CodeType):
|
||||||
|
for name in iter_symbols(const):
|
||||||
|
yield name
|
||||||
|
|
||||||
|
|
||||||
|
def can_scan():
|
||||||
|
if not sys.platform.startswith('java') and sys.platform != 'cli':
|
||||||
|
# CPython, PyPy, etc.
|
||||||
|
return True
|
||||||
|
log.warn("Unable to analyze compiled code on this platform.")
|
||||||
|
log.warn("Please ask the author to include a 'zip_safe'"
|
||||||
|
" setting (either True or False) in the package's setup.py")
|
||||||
|
|
||||||
|
|
||||||
|
# Attribute names of options for commands that might need to be convinced to
|
||||||
|
# install to the egg build directory
|
||||||
|
|
||||||
|
INSTALL_DIRECTORY_ATTRS = [
|
||||||
|
'install_lib', 'install_dir', 'install_data', 'install_base'
|
||||||
|
]
|
||||||
|
|
||||||
|
|
||||||
|
def make_zipfile(zip_filename, base_dir, verbose=0, dry_run=0, compress=True,
|
||||||
|
mode='w'):
|
||||||
|
"""Create a zip file from all the files under 'base_dir'. The output
|
||||||
|
zip file will be named 'base_dir' + ".zip". Uses either the "zipfile"
|
||||||
|
Python module (if available) or the InfoZIP "zip" utility (if installed
|
||||||
|
and found on the default search path). If neither tool is available,
|
||||||
|
raises DistutilsExecError. Returns the name of the output zip file.
|
||||||
|
"""
|
||||||
|
import zipfile
|
||||||
|
|
||||||
|
mkpath(os.path.dirname(zip_filename), dry_run=dry_run)
|
||||||
|
log.info("creating '%s' and adding '%s' to it", zip_filename, base_dir)
|
||||||
|
|
||||||
|
def visit(z, dirname, names):
|
||||||
|
for name in names:
|
||||||
|
path = os.path.normpath(os.path.join(dirname, name))
|
||||||
|
if os.path.isfile(path):
|
||||||
|
p = path[len(base_dir) + 1:]
|
||||||
|
if not dry_run:
|
||||||
|
z.write(path, p)
|
||||||
|
log.debug("adding '%s'", p)
|
||||||
|
|
||||||
|
compression = zipfile.ZIP_DEFLATED if compress else zipfile.ZIP_STORED
|
||||||
|
if not dry_run:
|
||||||
|
z = zipfile.ZipFile(zip_filename, mode, compression=compression)
|
||||||
|
for dirname, dirs, files in sorted_walk(base_dir):
|
||||||
|
visit(z, dirname, files)
|
||||||
|
z.close()
|
||||||
|
else:
|
||||||
|
for dirname, dirs, files in sorted_walk(base_dir):
|
||||||
|
visit(None, dirname, files)
|
||||||
|
return zip_filename
|
||||||
@@ -0,0 +1,43 @@
import distutils.command.bdist_rpm as orig


class bdist_rpm(orig.bdist_rpm):
    """
    Override the default bdist_rpm behavior to do the following:

    1. Run egg_info to ensure the name and version are properly calculated.
    2. Always run 'install' using --single-version-externally-managed to
       disable eggs in RPM distributions.
    3. Replace dash with underscore in the version numbers for better RPM
       compatibility.
    """

    def run(self):
        # ensure distro name is up-to-date
        self.run_command('egg_info')

        orig.bdist_rpm.run(self)

    def _make_spec_file(self):
        version = self.distribution.get_version()
        rpmversion = version.replace('-', '_')
        spec = orig.bdist_rpm._make_spec_file(self)
        line23 = '%define version ' + version
        line24 = '%define version ' + rpmversion
        spec = [
            line.replace(
                "Source0: %{name}-%{version}.tar",
                "Source0: %{name}-%{unmangled_version}.tar"
            ).replace(
                "setup.py install ",
                "setup.py install --single-version-externally-managed "
            ).replace(
                "%setup",
                "%setup -n %{name}-%{unmangled_version}"
            ).replace(line23, line24)
            for line in spec
        ]
        insert_loc = spec.index(line24) + 1
        unmangled_version = "%define unmangled_version " + version
        spec.insert(insert_loc, unmangled_version)
        return spec
@@ -0,0 +1,21 @@
import distutils.command.bdist_wininst as orig


class bdist_wininst(orig.bdist_wininst):
    def reinitialize_command(self, command, reinit_subcommands=0):
        """
        Supplement reinitialize_command to work around
        http://bugs.python.org/issue20819
        """
        cmd = self.distribution.reinitialize_command(
            command, reinit_subcommands)
        if command in ('install', 'install_lib'):
            cmd.install_lib = None
        return cmd

    def run(self):
        self._is_running = True
        try:
            orig.bdist_wininst.run(self)
        finally:
            self._is_running = False
@@ -0,0 +1,98 @@
import distutils.command.build_clib as orig
from distutils.errors import DistutilsSetupError
from distutils import log
from setuptools.dep_util import newer_pairwise_group


class build_clib(orig.build_clib):
    """
    Override the default build_clib behaviour to do the following:

    1. Implement a rudimentary timestamp-based dependency system
       so 'compile()' doesn't run every time.
    2. Add more keys to the 'build_info' dictionary:
        * obj_deps - specify dependencies for each object compiled.
                     this should be a dictionary mapping a key
                     with the source filename to a list of
                     dependencies. Use an empty string for global
                     dependencies.
        * cflags - specify a list of additional flags to pass to
                   the compiler.
    """

    def build_libraries(self, libraries):
        for (lib_name, build_info) in libraries:
            sources = build_info.get('sources')
            if sources is None or not isinstance(sources, (list, tuple)):
                raise DistutilsSetupError(
                    "in 'libraries' option (library '%s'), "
                    "'sources' must be present and must be "
                    "a list of source filenames" % lib_name)
            sources = list(sources)

            log.info("building '%s' library", lib_name)

            # Make sure everything is the correct type.
            # obj_deps should be a dictionary of keys as sources
            # and a list/tuple of files that are its dependencies.
            obj_deps = build_info.get('obj_deps', dict())
            if not isinstance(obj_deps, dict):
                raise DistutilsSetupError(
                    "in 'libraries' option (library '%s'), "
                    "'obj_deps' must be a dictionary of "
                    "type 'source: list'" % lib_name)
            dependencies = []

            # Get the global dependencies that are specified by the '' key.
            # These will go into every source's dependency list.
            global_deps = obj_deps.get('', list())
            if not isinstance(global_deps, (list, tuple)):
                raise DistutilsSetupError(
                    "in 'libraries' option (library '%s'), "
                    "'obj_deps' must be a dictionary of "
                    "type 'source: list'" % lib_name)

            # Build the list to be used by newer_pairwise_group
            # each source will be auto-added to its dependencies.
            for source in sources:
                src_deps = [source]
                src_deps.extend(global_deps)
                extra_deps = obj_deps.get(source, list())
                if not isinstance(extra_deps, (list, tuple)):
                    raise DistutilsSetupError(
                        "in 'libraries' option (library '%s'), "
                        "'obj_deps' must be a dictionary of "
                        "type 'source: list'" % lib_name)
                src_deps.extend(extra_deps)
                dependencies.append(src_deps)

            expected_objects = self.compiler.object_filenames(
                sources,
                output_dir=self.build_temp
            )

            if newer_pairwise_group(dependencies, expected_objects) != ([], []):
                # First, compile the source code to object files in the library
                # directory.  (This should probably change to putting object
                # files in a temporary build directory.)
                macros = build_info.get('macros')
                include_dirs = build_info.get('include_dirs')
                cflags = build_info.get('cflags')
                objects = self.compiler.compile(
                    sources,
                    output_dir=self.build_temp,
                    macros=macros,
                    include_dirs=include_dirs,
                    extra_postargs=cflags,
                    debug=self.debug
                )

            # Now "link" the object files together into a static library.
            # (On Unix at least, this isn't really linking -- it just
            # builds an archive.  Whatever.)
            self.compiler.create_static_lib(
                expected_objects,
                lib_name,
                output_dir=self.build_clib,
                debug=self.debug
            )
@@ -0,0 +1,331 @@
import os
import sys
import itertools
import imp
from distutils.command.build_ext import build_ext as _du_build_ext
from distutils.file_util import copy_file
from distutils.ccompiler import new_compiler
from distutils.sysconfig import customize_compiler, get_config_var
from distutils.errors import DistutilsError
from distutils import log

from setuptools.extension import Library
from setuptools.extern import six

try:
    # Attempt to use Cython for building extensions, if available
    from Cython.Distutils.build_ext import build_ext as _build_ext
    # Additionally, assert that the compiler module will load
    # also. Ref #1229.
    __import__('Cython.Compiler.Main')
except ImportError:
    _build_ext = _du_build_ext

# make sure _config_vars is initialized
get_config_var("LDSHARED")
from distutils.sysconfig import _config_vars as _CONFIG_VARS


def _customize_compiler_for_shlib(compiler):
    if sys.platform == "darwin":
        # building .dylib requires additional compiler flags on OSX; here we
        # temporarily substitute the pyconfig.h variables so that distutils'
        # 'customize_compiler' uses them before we build the shared libraries.
        tmp = _CONFIG_VARS.copy()
        try:
            # XXX Help! I don't have any idea whether these are right...
            _CONFIG_VARS['LDSHARED'] = (
                "gcc -Wl,-x -dynamiclib -undefined dynamic_lookup")
            _CONFIG_VARS['CCSHARED'] = " -dynamiclib"
            _CONFIG_VARS['SO'] = ".dylib"
            customize_compiler(compiler)
        finally:
            _CONFIG_VARS.clear()
            _CONFIG_VARS.update(tmp)
    else:
        customize_compiler(compiler)


have_rtld = False
use_stubs = False
libtype = 'shared'

if sys.platform == "darwin":
    use_stubs = True
elif os.name != 'nt':
    try:
        import dl
        use_stubs = have_rtld = hasattr(dl, 'RTLD_NOW')
    except ImportError:
        pass

if_dl = lambda s: s if have_rtld else ''


def get_abi3_suffix():
    """Return the file extension for an abi3-compliant Extension()"""
    for suffix, _, _ in (s for s in imp.get_suffixes() if s[2] == imp.C_EXTENSION):
        if '.abi3' in suffix:  # Unix
            return suffix
        elif suffix == '.pyd':  # Windows
            return suffix


class build_ext(_build_ext):
    def run(self):
        """Build extensions in build directory, then copy if --inplace"""
        old_inplace, self.inplace = self.inplace, 0
        _build_ext.run(self)
        self.inplace = old_inplace
        if old_inplace:
            self.copy_extensions_to_source()

    def copy_extensions_to_source(self):
        build_py = self.get_finalized_command('build_py')
        for ext in self.extensions:
            fullname = self.get_ext_fullname(ext.name)
            filename = self.get_ext_filename(fullname)
            modpath = fullname.split('.')
            package = '.'.join(modpath[:-1])
            package_dir = build_py.get_package_dir(package)
            dest_filename = os.path.join(package_dir,
                                         os.path.basename(filename))
            src_filename = os.path.join(self.build_lib, filename)

            # Always copy, even if source is older than destination, to ensure
            # that the right extensions for the current Python/platform are
            # used.
            copy_file(
                src_filename, dest_filename, verbose=self.verbose,
                dry_run=self.dry_run
            )
            if ext._needs_stub:
                self.write_stub(package_dir or os.curdir, ext, True)

    def get_ext_filename(self, fullname):
        filename = _build_ext.get_ext_filename(self, fullname)
        if fullname in self.ext_map:
            ext = self.ext_map[fullname]
            use_abi3 = (
                six.PY3
                and getattr(ext, 'py_limited_api')
                and get_abi3_suffix()
            )
            if use_abi3:
                so_ext = _get_config_var_837('EXT_SUFFIX')
                filename = filename[:-len(so_ext)]
                filename = filename + get_abi3_suffix()
            if isinstance(ext, Library):
                fn, ext = os.path.splitext(filename)
                return self.shlib_compiler.library_filename(fn, libtype)
            elif use_stubs and ext._links_to_dynamic:
                d, fn = os.path.split(filename)
                return os.path.join(d, 'dl-' + fn)
        return filename

    def initialize_options(self):
        _build_ext.initialize_options(self)
        self.shlib_compiler = None
        self.shlibs = []
        self.ext_map = {}

    def finalize_options(self):
        _build_ext.finalize_options(self)
        self.extensions = self.extensions or []
        self.check_extensions_list(self.extensions)
        self.shlibs = [ext for ext in self.extensions
                       if isinstance(ext, Library)]
        if self.shlibs:
            self.setup_shlib_compiler()
        for ext in self.extensions:
            ext._full_name = self.get_ext_fullname(ext.name)
        for ext in self.extensions:
            fullname = ext._full_name
            self.ext_map[fullname] = ext

            # distutils 3.1 will also ask for module names
            # XXX what to do with conflicts?
            self.ext_map[fullname.split('.')[-1]] = ext

            ltd = self.shlibs and self.links_to_dynamic(ext) or False
            ns = ltd and use_stubs and not isinstance(ext, Library)
            ext._links_to_dynamic = ltd
            ext._needs_stub = ns
            filename = ext._file_name = self.get_ext_filename(fullname)
            libdir = os.path.dirname(os.path.join(self.build_lib, filename))
            if ltd and libdir not in ext.library_dirs:
                ext.library_dirs.append(libdir)
            if ltd and use_stubs and os.curdir not in ext.runtime_library_dirs:
                ext.runtime_library_dirs.append(os.curdir)

    def setup_shlib_compiler(self):
        compiler = self.shlib_compiler = new_compiler(
            compiler=self.compiler, dry_run=self.dry_run, force=self.force
        )
        _customize_compiler_for_shlib(compiler)

        if self.include_dirs is not None:
            compiler.set_include_dirs(self.include_dirs)
        if self.define is not None:
            # 'define' option is a list of (name,value) tuples
            for (name, value) in self.define:
                compiler.define_macro(name, value)
        if self.undef is not None:
            for macro in self.undef:
                compiler.undefine_macro(macro)
        if self.libraries is not None:
            compiler.set_libraries(self.libraries)
        if self.library_dirs is not None:
            compiler.set_library_dirs(self.library_dirs)
        if self.rpath is not None:
            compiler.set_runtime_library_dirs(self.rpath)
        if self.link_objects is not None:
            compiler.set_link_objects(self.link_objects)

        # hack so distutils' build_extension() builds a library instead
        compiler.link_shared_object = link_shared_object.__get__(compiler)

    def get_export_symbols(self, ext):
        if isinstance(ext, Library):
            return ext.export_symbols
        return _build_ext.get_export_symbols(self, ext)

    def build_extension(self, ext):
        ext._convert_pyx_sources_to_lang()
        _compiler = self.compiler
        try:
            if isinstance(ext, Library):
                self.compiler = self.shlib_compiler
            _build_ext.build_extension(self, ext)
            if ext._needs_stub:
                cmd = self.get_finalized_command('build_py').build_lib
                self.write_stub(cmd, ext)
        finally:
            self.compiler = _compiler

    def links_to_dynamic(self, ext):
        """Return true if 'ext' links to a dynamic lib in the same package"""
        # XXX this should check to ensure the lib is actually being built
        # XXX as dynamic, and not just using a locally-found version or a
        # XXX static-compiled version
        libnames = dict.fromkeys([lib._full_name for lib in self.shlibs])
        pkg = '.'.join(ext._full_name.split('.')[:-1] + [''])
        return any(pkg + libname in libnames for libname in ext.libraries)

    def get_outputs(self):
        return _build_ext.get_outputs(self) + self.__get_stubs_outputs()

    def __get_stubs_outputs(self):
        # assemble the base name for each extension that needs a stub
        ns_ext_bases = (
            os.path.join(self.build_lib, *ext._full_name.split('.'))
            for ext in self.extensions
            if ext._needs_stub
        )
        # pair each base with the extension
        pairs = itertools.product(ns_ext_bases, self.__get_output_extensions())
        return list(base + fnext for base, fnext in pairs)

    def __get_output_extensions(self):
        yield '.py'
        yield '.pyc'
        if self.get_finalized_command('build_py').optimize:
            yield '.pyo'

    def write_stub(self, output_dir, ext, compile=False):
        log.info("writing stub loader for %s to %s", ext._full_name,
                 output_dir)
        stub_file = (os.path.join(output_dir, *ext._full_name.split('.')) +
                     '.py')
        if compile and os.path.exists(stub_file):
            raise DistutilsError(stub_file + " already exists! Please delete.")
        if not self.dry_run:
            f = open(stub_file, 'w')
            f.write(
                '\n'.join([
                    "def __bootstrap__():",
                    "   global __bootstrap__, __file__, __loader__",
                    "   import sys, os, pkg_resources, imp" + if_dl(", dl"),
                    "   __file__ = pkg_resources.resource_filename"
                    "(__name__,%r)"
                    % os.path.basename(ext._file_name),
                    "   del __bootstrap__",
                    "   if '__loader__' in globals():",
                    "       del __loader__",
                    if_dl("   old_flags = sys.getdlopenflags()"),
                    "   old_dir = os.getcwd()",
                    "   try:",
                    "     os.chdir(os.path.dirname(__file__))",
                    if_dl("     sys.setdlopenflags(dl.RTLD_NOW)"),
                    "     imp.load_dynamic(__name__,__file__)",
                    "   finally:",
                    if_dl("     sys.setdlopenflags(old_flags)"),
                    "     os.chdir(old_dir)",
                    "__bootstrap__()",
                    ""  # terminal \n
                ])
            )
            f.close()
        if compile:
            from distutils.util import byte_compile

            byte_compile([stub_file], optimize=0,
                         force=True, dry_run=self.dry_run)
            optimize = self.get_finalized_command('install_lib').optimize
            if optimize > 0:
                byte_compile([stub_file], optimize=optimize,
                             force=True, dry_run=self.dry_run)
            if os.path.exists(stub_file) and not self.dry_run:
                os.unlink(stub_file)


if use_stubs or os.name == 'nt':
    # Build shared libraries
    #
    def link_shared_object(
            self, objects, output_libname, output_dir=None, libraries=None,
            library_dirs=None, runtime_library_dirs=None, export_symbols=None,
            debug=0, extra_preargs=None, extra_postargs=None, build_temp=None,
            target_lang=None):
        self.link(
            self.SHARED_LIBRARY, objects, output_libname,
            output_dir, libraries, library_dirs, runtime_library_dirs,
            export_symbols, debug, extra_preargs, extra_postargs,
            build_temp, target_lang
        )
else:
    # Build static libraries everywhere else
    libtype = 'static'

    def link_shared_object(
            self, objects, output_libname, output_dir=None, libraries=None,
            library_dirs=None, runtime_library_dirs=None, export_symbols=None,
            debug=0, extra_preargs=None, extra_postargs=None, build_temp=None,
            target_lang=None):
        # XXX we need to either disallow these attrs on Library instances,
        # or warn/abort here if set, or something...
        # libraries=None, library_dirs=None, runtime_library_dirs=None,
        # export_symbols=None, extra_preargs=None, extra_postargs=None,
        # build_temp=None

        assert output_dir is None  # distutils build_ext doesn't pass this
        output_dir, filename = os.path.split(output_libname)
        basename, ext = os.path.splitext(filename)
        if self.library_filename("x").startswith('lib'):
            # strip 'lib' prefix; this is kludgy if some platform uses
            # a different prefix
            basename = basename[3:]

        self.create_static_lib(
            objects, basename, output_dir, debug, target_lang
        )


def _get_config_var_837(name):
    """
    In https://github.com/pypa/setuptools/pull/837, we discovered
    Python 3.3.0 exposes the extension suffix under the name 'SO'.
    """
    if sys.version_info < (3, 3, 1):
        name = 'SO'
    return get_config_var(name)
@@ -0,0 +1,270 @@
from glob import glob
from distutils.util import convert_path
import distutils.command.build_py as orig
import os
import fnmatch
import textwrap
import io
import distutils.errors
import itertools

from setuptools.extern import six
from setuptools.extern.six.moves import map, filter, filterfalse

try:
    from setuptools.lib2to3_ex import Mixin2to3
except ImportError:

    class Mixin2to3:
        def run_2to3(self, files, doctests=True):
            "do nothing"


class build_py(orig.build_py, Mixin2to3):
    """Enhanced 'build_py' command that includes data files with packages

    The data files are specified via a 'package_data' argument to 'setup()'.
    See 'setuptools.dist.Distribution' for more details.

    Also, this version of the 'build_py' command allows you to specify both
    'py_modules' and 'packages' in the same setup operation.
    """

    def finalize_options(self):
        orig.build_py.finalize_options(self)
        self.package_data = self.distribution.package_data
        self.exclude_package_data = (self.distribution.exclude_package_data or
                                     {})
        if 'data_files' in self.__dict__:
            del self.__dict__['data_files']
        self.__updated_files = []
        self.__doctests_2to3 = []

    def run(self):
        """Build modules, packages, and copy data files to build directory"""
        if not self.py_modules and not self.packages:
            return

        if self.py_modules:
            self.build_modules()

        if self.packages:
            self.build_packages()
            self.build_package_data()

        self.run_2to3(self.__updated_files, False)
        self.run_2to3(self.__updated_files, True)
        self.run_2to3(self.__doctests_2to3, True)

        # Only compile actual .py files, using our base class' idea of what our
        # output files are.
        self.byte_compile(orig.build_py.get_outputs(self, include_bytecode=0))

    def __getattr__(self, attr):
        "lazily compute data files"
        if attr == 'data_files':
            self.data_files = self._get_data_files()
            return self.data_files
        return orig.build_py.__getattr__(self, attr)

    def build_module(self, module, module_file, package):
        if six.PY2 and isinstance(package, six.string_types):
            # avoid errors on Python 2 when unicode is passed (#190)
            package = package.split('.')
        outfile, copied = orig.build_py.build_module(self, module, module_file,
                                                     package)
        if copied:
            self.__updated_files.append(outfile)
        return outfile, copied

    def _get_data_files(self):
        """Generate list of '(package,src_dir,build_dir,filenames)' tuples"""
        self.analyze_manifest()
        return list(map(self._get_pkg_data_files, self.packages or ()))

    def _get_pkg_data_files(self, package):
        # Locate package source directory
        src_dir = self.get_package_dir(package)

        # Compute package build directory
        build_dir = os.path.join(*([self.build_lib] + package.split('.')))

        # Strip directory from globbed filenames
        filenames = [
            os.path.relpath(file, src_dir)
            for file in self.find_data_files(package, src_dir)
        ]
        return package, src_dir, build_dir, filenames

    def find_data_files(self, package, src_dir):
        """Return filenames for package's data files in 'src_dir'"""
        patterns = self._get_platform_patterns(
            self.package_data,
            package,
            src_dir,
        )
        globs_expanded = map(glob, patterns)
        # flatten the expanded globs into an iterable of matches
        globs_matches = itertools.chain.from_iterable(globs_expanded)
        glob_files = filter(os.path.isfile, globs_matches)
        files = itertools.chain(
            self.manifest_files.get(package, []),
            glob_files,
        )
        return self.exclude_data_files(package, src_dir, files)

    def build_package_data(self):
        """Copy data files into build directory"""
        for package, src_dir, build_dir, filenames in self.data_files:
            for filename in filenames:
                target = os.path.join(build_dir, filename)
                self.mkpath(os.path.dirname(target))
                srcfile = os.path.join(src_dir, filename)
                outf, copied = self.copy_file(srcfile, target)
                srcfile = os.path.abspath(srcfile)
                if (copied and
                        srcfile in self.distribution.convert_2to3_doctests):
                    self.__doctests_2to3.append(outf)

    def analyze_manifest(self):
        self.manifest_files = mf = {}
        if not self.distribution.include_package_data:
            return
        src_dirs = {}
        for package in self.packages or ():
            # Locate package source directory
            src_dirs[assert_relative(self.get_package_dir(package))] = package

        self.run_command('egg_info')
        ei_cmd = self.get_finalized_command('egg_info')
        for path in ei_cmd.filelist.files:
            d, f = os.path.split(assert_relative(path))
            prev = None
            oldf = f
            while d and d != prev and d not in src_dirs:
                prev = d
                d, df = os.path.split(d)
                f = os.path.join(df, f)
            if d in src_dirs:
                if path.endswith('.py') and f == oldf:
                    continue  # it's a module, not data
                mf.setdefault(src_dirs[d], []).append(path)

    def get_data_files(self):
        pass  # Lazily compute data files in _get_data_files() function.

    def check_package(self, package, package_dir):
        """Check namespace packages' __init__ for declare_namespace"""
        try:
            return self.packages_checked[package]
        except KeyError:
            pass

        init_py = orig.build_py.check_package(self, package, package_dir)
        self.packages_checked[package] = init_py

        if not init_py or not self.distribution.namespace_packages:
            return init_py

        for pkg in self.distribution.namespace_packages:
            if pkg == package or pkg.startswith(package + '.'):
                break
        else:
            return init_py

        with io.open(init_py, 'rb') as f:
            contents = f.read()
        if b'declare_namespace' not in contents:
            raise distutils.errors.DistutilsError(
                "Namespace package problem: %s is a namespace package, but "
                "its\n__init__.py does not call declare_namespace()! Please "
                'fix it.\n(See the setuptools manual under '
                '"Namespace Packages" for details.)\n"' % (package,)
            )
        return init_py

    def initialize_options(self):
        self.packages_checked = {}
        orig.build_py.initialize_options(self)

    def get_package_dir(self, package):
        res = orig.build_py.get_package_dir(self, package)
        if self.distribution.src_root is not None:
            return os.path.join(self.distribution.src_root, res)
        return res

    def exclude_data_files(self, package, src_dir, files):
        """Filter filenames for package's data files in 'src_dir'"""
        files = list(files)
        patterns = self._get_platform_patterns(
            self.exclude_package_data,
            package,
            src_dir,
        )
        match_groups = (
            fnmatch.filter(files, pattern)
            for pattern in patterns
        )
        # flatten the groups of matches into an iterable of matches
        matches = itertools.chain.from_iterable(match_groups)
        bad = set(matches)
        keepers = (
            fn
            for fn in files
            if fn not in bad
        )
        # ditch dupes
        return list(_unique_everseen(keepers))

    @staticmethod
    def _get_platform_patterns(spec, package, src_dir):
        """
        yield platform-specific path patterns (suitable for glob
        or fn_match) from a glob-based spec (such as
        self.package_data or self.exclude_package_data)
        matching package in src_dir.
        """
        raw_patterns = itertools.chain(
            spec.get('', []),
            spec.get(package, []),
        )
        return (
            # Each pattern has to be converted to a platform-specific path
            os.path.join(src_dir, convert_path(pattern))
            for pattern in raw_patterns
        )


# from Python docs
def _unique_everseen(iterable, key=None):
    "List unique elements, preserving order. Remember all elements ever seen."
    # unique_everseen('AAAABBBCCDAABBB') --> A B C D
    # unique_everseen('ABBCcAD', str.lower) --> A B C D
    seen = set()
    seen_add = seen.add
    if key is None:
        for element in filterfalse(seen.__contains__, iterable):
            seen_add(element)
            yield element
    else:
        for element in iterable:
            k = key(element)
            if k not in seen:
                seen_add(k)
                yield element


def assert_relative(path):
    if not os.path.isabs(path):
        return path
    from distutils.errors import DistutilsSetupError

    msg = textwrap.dedent("""
        Error: setup script specifies an absolute path:

            %s

        setup() arguments must *always* be /-separated paths relative to the
        setup.py directory, *never* absolute paths.
        """).lstrip() % path
    raise DistutilsSetupError(msg)
@@ -0,0 +1,216 @@
from distutils.util import convert_path
from distutils import log
from distutils.errors import DistutilsError, DistutilsOptionError
import os
import glob
import io

from setuptools.extern import six

from pkg_resources import Distribution, PathMetadata, normalize_path
from setuptools.command.easy_install import easy_install
from setuptools import namespaces
import setuptools


class develop(namespaces.DevelopInstaller, easy_install):
    """Set up package for development"""

    description = "install package in 'development mode'"

    user_options = easy_install.user_options + [
        ("uninstall", "u", "Uninstall this source package"),
        ("egg-path=", None, "Set the path to be used in the .egg-link file"),
    ]

    boolean_options = easy_install.boolean_options + ['uninstall']

    command_consumes_arguments = False  # override base

    def run(self):
        if self.uninstall:
            self.multi_version = True
            self.uninstall_link()
            self.uninstall_namespaces()
        else:
            self.install_for_development()
        self.warn_deprecated_options()

    def initialize_options(self):
        self.uninstall = None
        self.egg_path = None
        easy_install.initialize_options(self)
        self.setup_path = None
        self.always_copy_from = '.'  # always copy eggs installed in curdir

    def finalize_options(self):
        ei = self.get_finalized_command("egg_info")
        if ei.broken_egg_info:
            template = "Please rename %r to %r before using 'develop'"
            args = ei.egg_info, ei.broken_egg_info
            raise DistutilsError(template % args)
        self.args = [ei.egg_name]

        easy_install.finalize_options(self)
        self.expand_basedirs()
        self.expand_dirs()
        # pick up setup-dir .egg files only: no .egg-info
        self.package_index.scan(glob.glob('*.egg'))

        egg_link_fn = ei.egg_name + '.egg-link'
        self.egg_link = os.path.join(self.install_dir, egg_link_fn)
        self.egg_base = ei.egg_base
        if self.egg_path is None:
            self.egg_path = os.path.abspath(ei.egg_base)

        target = normalize_path(self.egg_base)
        egg_path = normalize_path(os.path.join(self.install_dir,
                                               self.egg_path))
        if egg_path != target:
            raise DistutilsOptionError(
                "--egg-path must be a relative path from the install"
                " directory to " + target
            )

        # Make a distribution for the package's source
        self.dist = Distribution(
            target,
            PathMetadata(target, os.path.abspath(ei.egg_info)),
            project_name=ei.egg_name
        )

        self.setup_path = self._resolve_setup_path(
            self.egg_base,
            self.install_dir,
            self.egg_path,
        )

    @staticmethod
    def _resolve_setup_path(egg_base, install_dir, egg_path):
        """
        Generate a path from egg_base back to '.' where the
        setup script resides and ensure that path points to the
        setup path from $install_dir/$egg_path.
        """
        path_to_setup = egg_base.replace(os.sep, '/').rstrip('/')
        if path_to_setup != os.curdir:
            path_to_setup = '../' * (path_to_setup.count('/') + 1)
        resolved = normalize_path(
            os.path.join(install_dir, egg_path, path_to_setup)
        )
        if resolved != normalize_path(os.curdir):
            raise DistutilsOptionError(
                "Can't get a consistent path to setup script from"
                " installation directory", resolved, normalize_path(os.curdir))
        return path_to_setup

    def install_for_development(self):
        if six.PY3 and getattr(self.distribution, 'use_2to3', False):
            # If we run 2to3 we can not do this inplace:

            # Ensure metadata is up-to-date
            self.reinitialize_command('build_py', inplace=0)
            self.run_command('build_py')
            bpy_cmd = self.get_finalized_command("build_py")
            build_path = normalize_path(bpy_cmd.build_lib)

            # Build extensions
            self.reinitialize_command('egg_info', egg_base=build_path)
            self.run_command('egg_info')

            self.reinitialize_command('build_ext', inplace=0)
            self.run_command('build_ext')

            # Fixup egg-link and easy-install.pth
            ei_cmd = self.get_finalized_command("egg_info")
            self.egg_path = build_path
            self.dist.location = build_path
            # XXX
            self.dist._provider = PathMetadata(build_path, ei_cmd.egg_info)
        else:
            # Without 2to3 inplace works fine:
            self.run_command('egg_info')

            # Build extensions in-place
            self.reinitialize_command('build_ext', inplace=1)
            self.run_command('build_ext')

        self.install_site_py()  # ensure that target dir is site-safe
        if setuptools.bootstrap_install_from:
            self.easy_install(setuptools.bootstrap_install_from)
            setuptools.bootstrap_install_from = None

        self.install_namespaces()

        # create an .egg-link in the installation dir, pointing to our egg
        log.info("Creating %s (link to %s)", self.egg_link, self.egg_base)
        if not self.dry_run:
            with open(self.egg_link, "w") as f:
                f.write(self.egg_path + "\n" + self.setup_path)
        # postprocess the installed distro, fixing up .pth, installing scripts,
        # and handling requirements
        self.process_distribution(None, self.dist, not self.no_deps)

    def uninstall_link(self):
        if os.path.exists(self.egg_link):
            log.info("Removing %s (link to %s)", self.egg_link, self.egg_base)
            egg_link_file = open(self.egg_link)
            contents = [line.rstrip() for line in egg_link_file]
            egg_link_file.close()
            if contents not in ([self.egg_path],
                                [self.egg_path, self.setup_path]):
                log.warn("Link points to %s: uninstall aborted", contents)
                return
            if not self.dry_run:
                os.unlink(self.egg_link)
        if not self.dry_run:
            self.update_pth(self.dist)  # remove any .pth link to us
        if self.distribution.scripts:
            # XXX should also check for entry point scripts!
            log.warn("Note: you must uninstall or replace scripts manually!")

    def install_egg_scripts(self, dist):
        if dist is not self.dist:
            # Installing a dependency, so fall back to normal behavior
            return easy_install.install_egg_scripts(self, dist)

        # create wrapper scripts in the script dir, pointing to dist.scripts

        # new-style...
        self.install_wrapper_scripts(dist)

        # ...and old-style
        for script_name in self.distribution.scripts or []:
            script_path = os.path.abspath(convert_path(script_name))
            script_name = os.path.basename(script_path)
            with io.open(script_path) as strm:
                script_text = strm.read()
            self.install_script(dist, script_name, script_text, script_path)

    def install_wrapper_scripts(self, dist):
        dist = VersionlessRequirement(dist)
        return easy_install.install_wrapper_scripts(self, dist)


class VersionlessRequirement(object):
    """
    Adapt a pkg_resources.Distribution to simply return the project
    name as the 'requirement' so that scripts will work across
    multiple versions.

    >>> dist = Distribution(project_name='foo', version='1.0')
    >>> str(dist.as_requirement())
    'foo==1.0'
    >>> adapted_dist = VersionlessRequirement(dist)
    >>> str(adapted_dist.as_requirement())
    'foo'
    """

    def __init__(self, dist):
        self.__dist = dist

    def __getattr__(self, name):
        return getattr(self.__dist, name)

    def as_requirement(self):
        return self.project_name
@@ -0,0 +1,36 @@
"""
Create a dist_info directory
As defined in the wheel specification
"""

import os

from distutils.core import Command
from distutils import log


class dist_info(Command):

    description = 'create a .dist-info directory'

    user_options = [
        ('egg-base=', 'e', "directory containing .egg-info directories"
                           " (default: top of the source tree)"),
    ]

    def initialize_options(self):
        self.egg_base = None

    def finalize_options(self):
        pass

    def run(self):
        egg_info = self.get_finalized_command('egg_info')
        egg_info.egg_base = self.egg_base
        egg_info.finalize_options()
        egg_info.run()
        dist_info_dir = egg_info.egg_info[:-len('.egg-info')] + '.dist-info'
        log.info("creating '{}'".format(os.path.abspath(dist_info_dir)))

        bdist_wheel = self.get_finalized_command('bdist_wheel')
        bdist_wheel.egg2dist(egg_info.egg_info, dist_info_dir)
File diff suppressed because it is too large
@@ -0,0 +1,696 @@
|
|||||||
|
"""setuptools.command.egg_info
|
||||||
|
|
||||||
|
Create a distribution's .egg-info directory and contents"""
|
||||||
|
|
||||||
|
from distutils.filelist import FileList as _FileList
|
||||||
|
from distutils.errors import DistutilsInternalError
|
||||||
|
from distutils.util import convert_path
|
||||||
|
from distutils import log
|
||||||
|
import distutils.errors
|
||||||
|
import distutils.filelist
|
||||||
|
import os
|
||||||
|
import re
|
||||||
|
import sys
|
||||||
|
import io
|
||||||
|
import warnings
|
||||||
|
import time
|
||||||
|
import collections
|
||||||
|
|
||||||
|
from setuptools.extern import six
|
||||||
|
from setuptools.extern.six.moves import map
|
||||||
|
|
||||||
|
from setuptools import Command
|
||||||
|
from setuptools.command.sdist import sdist
|
||||||
|
from setuptools.command.sdist import walk_revctrl
|
||||||
|
from setuptools.command.setopt import edit_config
|
||||||
|
from setuptools.command import bdist_egg
|
||||||
|
from pkg_resources import (
|
||||||
|
parse_requirements, safe_name, parse_version,
|
||||||
|
safe_version, yield_lines, EntryPoint, iter_entry_points, to_filename)
|
||||||
|
import setuptools.unicode_utils as unicode_utils
|
||||||
|
from setuptools.glob import glob
|
||||||
|
|
||||||
|
from setuptools.extern import packaging
|
||||||
|
|
||||||
|
|
||||||
|
def translate_pattern(glob):
|
||||||
|
"""
|
||||||
|
Translate a file path glob like '*.txt' in to a regular expression.
|
||||||
|
This differs from fnmatch.translate which allows wildcards to match
|
||||||
|
directory separators. It also knows about '**/' which matches any number of
|
||||||
|
directories.
|
||||||
|
"""
|
||||||
|
pat = ''
|
||||||
|
|
||||||
|
# This will split on '/' within [character classes]. This is deliberate.
|
||||||
|
chunks = glob.split(os.path.sep)
|
||||||
|
|
||||||
|
sep = re.escape(os.sep)
|
||||||
|
valid_char = '[^%s]' % (sep,)
|
||||||
|
|
||||||
|
for c, chunk in enumerate(chunks):
|
||||||
|
last_chunk = c == len(chunks) - 1
|
||||||
|
|
||||||
|
# Chunks that are a literal ** are globstars. They match anything.
|
||||||
|
if chunk == '**':
|
||||||
|
if last_chunk:
|
||||||
|
# Match anything if this is the last component
|
||||||
|
pat += '.*'
|
||||||
|
else:
|
||||||
|
# Match '(name/)*'
|
||||||
|
pat += '(?:%s+%s)*' % (valid_char, sep)
|
||||||
|
continue # Break here as the whole path component has been handled
|
||||||
|
|
||||||
|
# Find any special characters in the remainder
|
||||||
|
i = 0
|
||||||
|
chunk_len = len(chunk)
|
||||||
|
while i < chunk_len:
|
||||||
|
char = chunk[i]
|
||||||
|
if char == '*':
|
||||||
|
# Match any number of name characters
|
||||||
|
pat += valid_char + '*'
|
||||||
|
elif char == '?':
|
||||||
|
# Match a name character
|
||||||
|
pat += valid_char
|
||||||
|
elif char == '[':
|
||||||
|
# Character class
|
||||||
|
inner_i = i + 1
|
||||||
|
# Skip initial !/] chars
|
||||||
|
if inner_i < chunk_len and chunk[inner_i] == '!':
|
||||||
|
inner_i = inner_i + 1
|
||||||
|
if inner_i < chunk_len and chunk[inner_i] == ']':
|
||||||
|
inner_i = inner_i + 1
|
||||||
|
|
||||||
|
# Loop till the closing ] is found
|
||||||
|
while inner_i < chunk_len and chunk[inner_i] != ']':
|
||||||
|
inner_i = inner_i + 1
|
||||||
|
|
||||||
|
if inner_i >= chunk_len:
|
||||||
|
# Got to the end of the string without finding a closing ]
|
||||||
|
# Do not treat this as a matching group, but as a literal [
|
||||||
|
pat += re.escape(char)
|
||||||
|
else:
|
||||||
|
# Grab the insides of the [brackets]
|
||||||
|
inner = chunk[i + 1:inner_i]
|
||||||
|
char_class = ''
|
||||||
|
|
||||||
|
# Class negation
|
||||||
|
if inner[0] == '!':
|
||||||
|
char_class = '^'
|
||||||
|
inner = inner[1:]
|
||||||
|
|
||||||
|
char_class += re.escape(inner)
|
||||||
|
pat += '[%s]' % (char_class,)
|
||||||
|
|
||||||
|
# Skip to the end ]
|
||||||
|
i = inner_i
|
||||||
|
else:
|
||||||
|
pat += re.escape(char)
|
||||||
|
i += 1
|
||||||
|
|
||||||
|
# Join each chunk with the dir separator
|
||||||
|
if not last_chunk:
|
||||||
|
pat += sep
|
||||||
|
|
||||||
|
pat += r'\Z'
|
||||||
|
return re.compile(pat, flags=re.MULTILINE|re.DOTALL)
|
||||||
|
|
||||||
|
|
||||||
|
class egg_info(Command):
|
||||||
|
description = "create a distribution's .egg-info directory"
|
||||||
|
|
||||||
|
user_options = [
|
||||||
|
('egg-base=', 'e', "directory containing .egg-info directories"
|
||||||
|
" (default: top of the source tree)"),
|
||||||
|
('tag-date', 'd', "Add date stamp (e.g. 20050528) to version number"),
|
||||||
|
('tag-build=', 'b', "Specify explicit tag to add to version number"),
|
||||||
|
('no-date', 'D', "Don't include date stamp [default]"),
|
||||||
|
]
|
||||||
|
|
||||||
|
boolean_options = ['tag-date']
|
||||||
|
negative_opt = {
|
||||||
|
'no-date': 'tag-date',
|
||||||
|
}
|
||||||
|
|
||||||
|
def initialize_options(self):
|
||||||
|
self.egg_name = None
|
||||||
|
self.egg_version = None
|
||||||
|
self.egg_base = None
|
||||||
|
self.egg_info = None
|
||||||
|
self.tag_build = None
|
||||||
|
self.tag_date = 0
|
||||||
|
self.broken_egg_info = False
|
||||||
|
self.vtags = None
|
||||||
|
|
||||||
|
####################################
|
||||||
|
# allow the 'tag_svn_revision' to be detected and
|
||||||
|
# set, supporting sdists built on older Setuptools.
|
||||||
|
@property
|
||||||
|
def tag_svn_revision(self):
|
||||||
|
pass
|
||||||
|
|
||||||
|
@tag_svn_revision.setter
|
||||||
|
def tag_svn_revision(self, value):
|
||||||
|
pass
|
||||||
|
####################################
|
||||||
|
|
||||||
|
def save_version_info(self, filename):
|
||||||
|
"""
|
||||||
|
Materialize the value of date into the
|
||||||
|
build tag. Install build keys in a deterministic order
|
||||||
|
to avoid arbitrary reordering on subsequent builds.
|
||||||
|
"""
|
||||||
|
egg_info = collections.OrderedDict()
|
||||||
|
# follow the order these keys would have been added
|
||||||
|
# when PYTHONHASHSEED=0
|
||||||
|
egg_info['tag_build'] = self.tags()
|
||||||
|
egg_info['tag_date'] = 0
|
||||||
|
edit_config(filename, dict(egg_info=egg_info))
|
||||||
|
|
||||||
|
def finalize_options(self):
|
||||||
|
self.egg_name = safe_name(self.distribution.get_name())
|
||||||
|
self.vtags = self.tags()
|
||||||
|
self.egg_version = self.tagged_version()
|
||||||
|
|
||||||
|
parsed_version = parse_version(self.egg_version)
|
||||||
|
|
||||||
|
try:
|
||||||
|
is_version = isinstance(parsed_version, packaging.version.Version)
|
||||||
|
spec = (
|
||||||
|
"%s==%s" if is_version else "%s===%s"
|
||||||
|
)
|
||||||
|
list(
|
||||||
|
parse_requirements(spec % (self.egg_name, self.egg_version))
|
||||||
|
)
|
||||||
|
except ValueError:
|
||||||
|
raise distutils.errors.DistutilsOptionError(
|
||||||
|
"Invalid distribution name or version syntax: %s-%s" %
|
||||||
|
(self.egg_name, self.egg_version)
|
||||||
|
)
|
||||||
|
|
||||||
|
if self.egg_base is None:
|
||||||
|
dirs = self.distribution.package_dir
|
||||||
|
self.egg_base = (dirs or {}).get('', os.curdir)
|
||||||
|
|
||||||
|
self.ensure_dirname('egg_base')
|
||||||
|
self.egg_info = to_filename(self.egg_name) + '.egg-info'
|
||||||
|
if self.egg_base != os.curdir:
|
||||||
|
self.egg_info = os.path.join(self.egg_base, self.egg_info)
|
||||||
|
if '-' in self.egg_name:
|
||||||
|
self.check_broken_egg_info()
|
||||||
|
|
||||||
|
# Set package version for the benefit of dumber commands
|
||||||
|
# (e.g. sdist, bdist_wininst, etc.)
|
||||||
|
#
|
||||||
|
self.distribution.metadata.version = self.egg_version
|
||||||
|
|
||||||
|
# If we bootstrapped around the lack of a PKG-INFO, as might be the
|
||||||
|
# case in a fresh checkout, make sure that any special tags get added
|
||||||
|
# to the version info
|
||||||
|
#
|
||||||
|
pd = self.distribution._patched_dist
|
||||||
|
if pd is not None and pd.key == self.egg_name.lower():
|
||||||
|
pd._version = self.egg_version
|
||||||
|
pd._parsed_version = parse_version(self.egg_version)
|
||||||
|
self.distribution._patched_dist = None
|
||||||
|
|
||||||
|
def write_or_delete_file(self, what, filename, data, force=False):
|
||||||
|
"""Write `data` to `filename` or delete if empty
|
||||||
|
|
||||||
|
If `data` is non-empty, this routine is the same as ``write_file()``.
|
||||||
|
If `data` is empty but not ``None``, this is the same as calling
|
||||||
|
``delete_file(filename)`. If `data` is ``None``, then this is a no-op
|
||||||
|
unless `filename` exists, in which case a warning is issued about the
|
||||||
|
orphaned file (if `force` is false), or deleted (if `force` is true).
|
||||||
|
"""
|
||||||
|
if data:
|
||||||
|
self.write_file(what, filename, data)
|
||||||
|
elif os.path.exists(filename):
|
||||||
|
if data is None and not force:
|
||||||
|
log.warn(
|
||||||
|
"%s not set in setup(), but %s exists", what, filename
|
||||||
|
)
|
||||||
|
return
|
||||||
|
else:
|
||||||
|
self.delete_file(filename)
|
||||||
|
|
||||||
|
def write_file(self, what, filename, data):
|
||||||
|
"""Write `data` to `filename` (if not a dry run) after announcing it
|
||||||
|
|
||||||
|
`what` is used in a log message to identify what is being written
|
||||||
|
to the file.
|
||||||
|
"""
|
||||||
|
        log.info("writing %s to %s", what, filename)
        if six.PY3:
            data = data.encode("utf-8")
        if not self.dry_run:
            f = open(filename, 'wb')
            f.write(data)
            f.close()

    def delete_file(self, filename):
        """Delete `filename` (if not a dry run) after announcing it"""
        log.info("deleting %s", filename)
        if not self.dry_run:
            os.unlink(filename)

    def tagged_version(self):
        version = self.distribution.get_version()
        # egg_info may be called more than once for a distribution,
        # in which case the version string already contains all tags.
        if self.vtags and version.endswith(self.vtags):
            return safe_version(version)
        return safe_version(version + self.vtags)

    def run(self):
        self.mkpath(self.egg_info)
        installer = self.distribution.fetch_build_egg
        for ep in iter_entry_points('egg_info.writers'):
            ep.require(installer=installer)
            writer = ep.resolve()
            writer(self, ep.name, os.path.join(self.egg_info, ep.name))

        # Get rid of native_libs.txt if it was put there by older bdist_egg
        nl = os.path.join(self.egg_info, "native_libs.txt")
        if os.path.exists(nl):
            self.delete_file(nl)

        self.find_sources()

    def tags(self):
        version = ''
        if self.tag_build:
            version += self.tag_build
        if self.tag_date:
            version += time.strftime("-%Y%m%d")
        return version

    def find_sources(self):
        """Generate SOURCES.txt manifest file"""
        manifest_filename = os.path.join(self.egg_info, "SOURCES.txt")
        mm = manifest_maker(self.distribution)
        mm.manifest = manifest_filename
        mm.run()
        self.filelist = mm.filelist

    def check_broken_egg_info(self):
        bei = self.egg_name + '.egg-info'
        if self.egg_base != os.curdir:
            bei = os.path.join(self.egg_base, bei)
        if os.path.exists(bei):
            log.warn(
                "-" * 78 + '\n'
                "Note: Your current .egg-info directory has a '-' in its name;"
                '\nthis will not work correctly with "setup.py develop".\n\n'
                'Please rename %s to %s to correct this problem.\n' + '-' * 78,
                bei, self.egg_info
            )
            self.broken_egg_info = self.egg_info
            self.egg_info = bei  # make it work for now

class FileList(_FileList):
    # Implementations of the various MANIFEST.in commands

    def process_template_line(self, line):
        # Parse the line: split it up, make sure the right number of words
        # is there, and return the relevant words.  'action' is always
        # defined: it's the first word of the line.  Which of the other
        # three are defined depends on the action; it'll be either
        # patterns, (dir and patterns), or (dir_pattern).
        (action, patterns, dir, dir_pattern) = self._parse_template_line(line)

        # OK, now we know that the action is valid and we have the
        # right number of words on the line for that action -- so we
        # can proceed with minimal error-checking.
        if action == 'include':
            self.debug_print("include " + ' '.join(patterns))
            for pattern in patterns:
                if not self.include(pattern):
                    log.warn("warning: no files found matching '%s'", pattern)

        elif action == 'exclude':
            self.debug_print("exclude " + ' '.join(patterns))
            for pattern in patterns:
                if not self.exclude(pattern):
                    log.warn(("warning: no previously-included files "
                              "found matching '%s'"), pattern)

        elif action == 'global-include':
            self.debug_print("global-include " + ' '.join(patterns))
            for pattern in patterns:
                if not self.global_include(pattern):
                    log.warn(("warning: no files found matching '%s' "
                              "anywhere in distribution"), pattern)

        elif action == 'global-exclude':
            self.debug_print("global-exclude " + ' '.join(patterns))
            for pattern in patterns:
                if not self.global_exclude(pattern):
                    log.warn(("warning: no previously-included files matching "
                              "'%s' found anywhere in distribution"),
                             pattern)

        elif action == 'recursive-include':
            self.debug_print("recursive-include %s %s" %
                             (dir, ' '.join(patterns)))
            for pattern in patterns:
                if not self.recursive_include(dir, pattern):
                    log.warn(("warning: no files found matching '%s' "
                              "under directory '%s'"),
                             pattern, dir)

        elif action == 'recursive-exclude':
            self.debug_print("recursive-exclude %s %s" %
                             (dir, ' '.join(patterns)))
            for pattern in patterns:
                if not self.recursive_exclude(dir, pattern):
                    log.warn(("warning: no previously-included files matching "
                              "'%s' found under directory '%s'"),
                             pattern, dir)

        elif action == 'graft':
            self.debug_print("graft " + dir_pattern)
            if not self.graft(dir_pattern):
                log.warn("warning: no directories found matching '%s'",
                         dir_pattern)

        elif action == 'prune':
            self.debug_print("prune " + dir_pattern)
            if not self.prune(dir_pattern):
                log.warn(("no previously-included directories found "
                          "matching '%s'"), dir_pattern)

        else:
            raise DistutilsInternalError(
                "this cannot happen: invalid action '%s'" % action)

    def _remove_files(self, predicate):
        """
        Remove all files from the file list that match the predicate.
        Return True if any matching files were removed
        """
        found = False
        for i in range(len(self.files) - 1, -1, -1):
            if predicate(self.files[i]):
                self.debug_print(" removing " + self.files[i])
                del self.files[i]
                found = True
        return found

    def include(self, pattern):
        """Include files that match 'pattern'."""
        found = [f for f in glob(pattern) if not os.path.isdir(f)]
        self.extend(found)
        return bool(found)

    def exclude(self, pattern):
        """Exclude files that match 'pattern'."""
        match = translate_pattern(pattern)
        return self._remove_files(match.match)

    def recursive_include(self, dir, pattern):
        """
        Include all files anywhere in 'dir/' that match the pattern.
        """
        full_pattern = os.path.join(dir, '**', pattern)
        found = [f for f in glob(full_pattern, recursive=True)
                 if not os.path.isdir(f)]
        self.extend(found)
        return bool(found)

    def recursive_exclude(self, dir, pattern):
        """
        Exclude any file anywhere in 'dir/' that match the pattern.
        """
        match = translate_pattern(os.path.join(dir, '**', pattern))
        return self._remove_files(match.match)

    def graft(self, dir):
        """Include all files from 'dir/'."""
        found = [
            item
            for match_dir in glob(dir)
            for item in distutils.filelist.findall(match_dir)
        ]
        self.extend(found)
        return bool(found)

    def prune(self, dir):
        """Filter out files from 'dir/'."""
        match = translate_pattern(os.path.join(dir, '**'))
        return self._remove_files(match.match)

    def global_include(self, pattern):
        """
        Include all files anywhere in the current directory that match the
        pattern. This is very inefficient on large file trees.
        """
        if self.allfiles is None:
            self.findall()
        match = translate_pattern(os.path.join('**', pattern))
        found = [f for f in self.allfiles if match.match(f)]
        self.extend(found)
        return bool(found)

    def global_exclude(self, pattern):
        """
        Exclude all files anywhere that match the pattern.
        """
        match = translate_pattern(os.path.join('**', pattern))
        return self._remove_files(match.match)

    def append(self, item):
        if item.endswith('\r'):  # Fix older sdists built on Windows
            item = item[:-1]
        path = convert_path(item)

        if self._safe_path(path):
            self.files.append(path)

    def extend(self, paths):
        self.files.extend(filter(self._safe_path, paths))

    def _repair(self):
        """
        Replace self.files with only safe paths

        Because some owners of FileList manipulate the underlying
        ``files`` attribute directly, this method must be called to
        repair those paths.
        """
        self.files = list(filter(self._safe_path, self.files))

    def _safe_path(self, path):
        enc_warn = "'%s' not %s encodable -- skipping"

        # To avoid accidental trans-codings errors, first to unicode
        u_path = unicode_utils.filesys_decode(path)
        if u_path is None:
            log.warn("'%s' in unexpected encoding -- skipping" % path)
            return False

        # Must ensure utf-8 encodability
        utf8_path = unicode_utils.try_encode(u_path, "utf-8")
        if utf8_path is None:
            log.warn(enc_warn, path, 'utf-8')
            return False

        try:
            # accept is either way checks out
            if os.path.exists(u_path) or os.path.exists(utf8_path):
                return True
        # this will catch any encode errors decoding u_path
        except UnicodeEncodeError:
            log.warn(enc_warn, path, sys.getfilesystemencoding())

class manifest_maker(sdist):
    template = "MANIFEST.in"

    def initialize_options(self):
        self.use_defaults = 1
        self.prune = 1
        self.manifest_only = 1
        self.force_manifest = 1

    def finalize_options(self):
        pass

    def run(self):
        self.filelist = FileList()
        if not os.path.exists(self.manifest):
            self.write_manifest()  # it must exist so it'll get in the list
        self.add_defaults()
        if os.path.exists(self.template):
            self.read_template()
        self.prune_file_list()
        self.filelist.sort()
        self.filelist.remove_duplicates()
        self.write_manifest()

    def _manifest_normalize(self, path):
        path = unicode_utils.filesys_decode(path)
        return path.replace(os.sep, '/')

    def write_manifest(self):
        """
        Write the file list in 'self.filelist' to the manifest file
        named by 'self.manifest'.
        """
        self.filelist._repair()

        # Now _repairs should encodability, but not unicode
        files = [self._manifest_normalize(f) for f in self.filelist.files]
        msg = "writing manifest file '%s'" % self.manifest
        self.execute(write_file, (self.manifest, files), msg)

    def warn(self, msg):
        if not self._should_suppress_warning(msg):
            sdist.warn(self, msg)

    @staticmethod
    def _should_suppress_warning(msg):
        """
        suppress missing-file warnings from sdist
        """
        return re.match(r"standard file .*not found", msg)

    def add_defaults(self):
        sdist.add_defaults(self)
        self.filelist.append(self.template)
        self.filelist.append(self.manifest)
        rcfiles = list(walk_revctrl())
        if rcfiles:
            self.filelist.extend(rcfiles)
        elif os.path.exists(self.manifest):
            self.read_manifest()
        ei_cmd = self.get_finalized_command('egg_info')
        self.filelist.graft(ei_cmd.egg_info)

    def prune_file_list(self):
        build = self.get_finalized_command('build')
        base_dir = self.distribution.get_fullname()
        self.filelist.prune(build.build_base)
        self.filelist.prune(base_dir)
        sep = re.escape(os.sep)
        self.filelist.exclude_pattern(r'(^|' + sep + r')(RCS|CVS|\.svn)' + sep,
                                      is_regex=1)

def write_file(filename, contents):
    """Create a file with the specified name and write 'contents' (a
    sequence of strings without line terminators) to it.
    """
    contents = "\n".join(contents)

    # assuming the contents has been vetted for utf-8 encoding
    contents = contents.encode("utf-8")

    with open(filename, "wb") as f:  # always write POSIX-style manifest
        f.write(contents)


def write_pkg_info(cmd, basename, filename):
    log.info("writing %s", filename)
    if not cmd.dry_run:
        metadata = cmd.distribution.metadata
        metadata.version, oldver = cmd.egg_version, metadata.version
        metadata.name, oldname = cmd.egg_name, metadata.name

        try:
            # write unescaped data to PKG-INFO, so older pkg_resources
            # can still parse it
            metadata.write_pkg_info(cmd.egg_info)
        finally:
            metadata.name, metadata.version = oldname, oldver

        safe = getattr(cmd.distribution, 'zip_safe', None)

        bdist_egg.write_safety_flag(cmd.egg_info, safe)


def warn_depends_obsolete(cmd, basename, filename):
    if os.path.exists(filename):
        log.warn(
            "WARNING: 'depends.txt' is not used by setuptools 0.6!\n"
            "Use the install_requires/extras_require setup() args instead."
        )


def _write_requirements(stream, reqs):
    lines = yield_lines(reqs or ())
    append_cr = lambda line: line + '\n'
    lines = map(append_cr, lines)
    stream.writelines(lines)


def write_requirements(cmd, basename, filename):
    dist = cmd.distribution
    data = six.StringIO()
    _write_requirements(data, dist.install_requires)
    extras_require = dist.extras_require or {}
    for extra in sorted(extras_require):
        data.write('\n[{extra}]\n'.format(**vars()))
        _write_requirements(data, extras_require[extra])
    cmd.write_or_delete_file("requirements", filename, data.getvalue())


def write_setup_requirements(cmd, basename, filename):
    data = io.StringIO()
    _write_requirements(data, cmd.distribution.setup_requires)
    cmd.write_or_delete_file("setup-requirements", filename, data.getvalue())


def write_toplevel_names(cmd, basename, filename):
    pkgs = dict.fromkeys(
        [
            k.split('.', 1)[0]
            for k in cmd.distribution.iter_distribution_names()
        ]
    )
    cmd.write_file("top-level names", filename, '\n'.join(sorted(pkgs)) + '\n')


def overwrite_arg(cmd, basename, filename):
    write_arg(cmd, basename, filename, True)


def write_arg(cmd, basename, filename, force=False):
    argname = os.path.splitext(basename)[0]
    value = getattr(cmd.distribution, argname, None)
    if value is not None:
        value = '\n'.join(value) + '\n'
    cmd.write_or_delete_file(argname, filename, value, force)


def write_entries(cmd, basename, filename):
    ep = cmd.distribution.entry_points

    if isinstance(ep, six.string_types) or ep is None:
        data = ep
    elif ep is not None:
        data = []
        for section, contents in sorted(ep.items()):
            if not isinstance(contents, six.string_types):
                contents = EntryPoint.parse_group(section, contents)
                contents = '\n'.join(sorted(map(str, contents.values())))
            data.append('[%s]\n%s\n\n' % (section, contents))
        data = ''.join(data)

    cmd.write_or_delete_file('entry points', filename, data, True)


def get_pkg_info_revision():
    """
    Get a -r### off of PKG-INFO Version in case this is an sdist of
    a subversion revision.
    """
    warnings.warn("get_pkg_info_revision is deprecated.", DeprecationWarning)
    if os.path.exists('PKG-INFO'):
        with io.open('PKG-INFO') as f:
            for line in f:
                match = re.match(r"Version:.*-r(\d+)\s*$", line)
                if match:
                    return int(match.group(1))
    return 0
@@ -0,0 +1,125 @@
from distutils.errors import DistutilsArgError
import inspect
import glob
import warnings
import platform
import distutils.command.install as orig

import setuptools

# Prior to numpy 1.9, NumPy relies on the '_install' name, so provide it for
# now. See https://github.com/pypa/setuptools/issues/199/
_install = orig.install


class install(orig.install):
    """Use easy_install to install the package, w/dependencies"""

    user_options = orig.install.user_options + [
        ('old-and-unmanageable', None, "Try not to use this!"),
        ('single-version-externally-managed', None,
         "used by system package builders to create 'flat' eggs"),
    ]
    boolean_options = orig.install.boolean_options + [
        'old-and-unmanageable', 'single-version-externally-managed',
    ]
    new_commands = [
        ('install_egg_info', lambda self: True),
        ('install_scripts', lambda self: True),
    ]
    _nc = dict(new_commands)

    def initialize_options(self):
        orig.install.initialize_options(self)
        self.old_and_unmanageable = None
        self.single_version_externally_managed = None

    def finalize_options(self):
        orig.install.finalize_options(self)
        if self.root:
            self.single_version_externally_managed = True
        elif self.single_version_externally_managed:
            if not self.root and not self.record:
                raise DistutilsArgError(
                    "You must specify --record or --root when building system"
                    " packages"
                )

    def handle_extra_path(self):
        if self.root or self.single_version_externally_managed:
            # explicit backward-compatibility mode, allow extra_path to work
            return orig.install.handle_extra_path(self)

        # Ignore extra_path when installing an egg (or being run by another
        # command without --root or --single-version-externally-managed
        self.path_file = None
        self.extra_dirs = ''

    def run(self):
        # Explicit request for old-style install?  Just do it
        if self.old_and_unmanageable or self.single_version_externally_managed:
            return orig.install.run(self)

        if not self._called_from_setup(inspect.currentframe()):
            # Run in backward-compatibility mode to support bdist_* commands.
            orig.install.run(self)
        else:
            self.do_egg_install()

    @staticmethod
    def _called_from_setup(run_frame):
        """
        Attempt to detect whether run() was called from setup() or by another
        command.  If called by setup(), the parent caller will be the
        'run_command' method in 'distutils.dist', and *its* caller will be
        the 'run_commands' method.  If called any other way, the
        immediate caller *might* be 'run_command', but it won't have been
        called by 'run_commands'. Return True in that case or if a call stack
        is unavailable. Return False otherwise.
        """
        if run_frame is None:
            msg = "Call stack not available. bdist_* commands may fail."
            warnings.warn(msg)
            if platform.python_implementation() == 'IronPython':
                msg = "For best results, pass -X:Frames to enable call stack."
                warnings.warn(msg)
            return True
        res = inspect.getouterframes(run_frame)[2]
        caller, = res[:1]
        info = inspect.getframeinfo(caller)
        caller_module = caller.f_globals.get('__name__', '')
        return (
            caller_module == 'distutils.dist'
            and info.function == 'run_commands'
        )

    def do_egg_install(self):

        easy_install = self.distribution.get_command_class('easy_install')

        cmd = easy_install(
            self.distribution, args="x", root=self.root, record=self.record,
        )
        cmd.ensure_finalized()  # finalize before bdist_egg munges install cmd
        cmd.always_copy_from = '.'  # make sure local-dir eggs get installed

        # pick up setup-dir .egg files only: no .egg-info
        cmd.package_index.scan(glob.glob('*.egg'))

        self.run_command('bdist_egg')
        args = [self.distribution.get_command_obj('bdist_egg').egg_output]

        if setuptools.bootstrap_install_from:
            # Bootstrap self-installation of setuptools
            args.insert(0, setuptools.bootstrap_install_from)

        cmd.args = args
        cmd.run()
        setuptools.bootstrap_install_from = None


# XXX Python 3.1 doesn't see _nc if this is inside the class
install.sub_commands = (
    [cmd for cmd in orig.install.sub_commands if cmd[0] not in install._nc] +
    install.new_commands
)