From eee03cac59ebbf2320bc5ca3a899fd08ba3c246a Mon Sep 17 00:00:00 2001
From: Pluto <113933026+PlutoNbai@users.noreply.github.com>
Date: Fri, 3 May 2024 14:42:23 -0400
Subject: [PATCH 01/10] Update README.md

---
 README.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/README.md b/README.md
index dde9a049..02aa2da1 100644
--- a/README.md
+++ b/README.md
@@ -21,7 +21,7 @@ Setting up the PYTHON SWAN SDK is straightforward.

 **Install via PyPI testnet:**

 ```bash
-[pip install -i https://test.pypi.org/simple/ orchestrator-sdk](https://pypi.org/project/swan-sdk/0.0.2/)
+pip install swan-sdk
 ```

 **Clone from GitHub:**

From 230180db353af8b06766cc095e5a4ea7b2131617 Mon Sep 17 00:00:00 2001
From: Zihang Chen <159826530+ZihangChenNBAI@users.noreply.github.com>
Date: Fri, 3 May 2024 15:26:59 -0400
Subject: [PATCH 02/10] removed unused code and updated readme

---
 README.md                   | 201 ++++++++++---
 docs/usage.md               | 240 +---------------
 examples/example-demo.ipynb | 366 ++++++-----------------
 swan/__init__.py            |   1 -
 swan/api/mcs_api.py         | 519 ---------------------------------
 swan/api/swan_api.py        |   2 +-
 swan/api_client.py          |  10 +-
 swan/common/__init__.py     |  15 +-
 swan/common/constant.py     |  34 +--
 swan/common/exception.py    |  44 +--
 swan/common/utils.py        |   2 -
 swan/object/__init__.py     |   3 +-
 swan/object/source_uri.py   | 369 ------------------------
 test/dev_test.ipynb         | 557 ------------------------------------
 test/github_test.ipynb      |  96 -------
 test/mcs_test.ipynb         | 352 -----------------------
 test/nsfw_test.py           |  10 -
 test/source.json            |   1 -
 18 files changed, 270 insertions(+), 2552 deletions(-)
 delete mode 100644 swan/api/mcs_api.py
 delete mode 100644 swan/object/source_uri.py
 delete mode 100644 test/dev_test.ipynb
 delete mode 100644 test/github_test.ipynb
 delete mode 100644 test/mcs_test.ipynb
 delete mode 100644 test/nsfw_test.py
 delete mode 100644 test/source.json

diff --git a/README.md b/README.md
index efcad2ed..2d875363 100644
--- a/README.md
+++ b/README.md
@@ -3,6 +3,25 @@
 [![Made by FilSwan](https://img.shields.io/badge/made%20by-FilSwan-green.svg)](https://www.filswan.com/)
 [![Chat on discord](https://img.shields.io/badge/join%20-discord-brightgreen.svg)](https://discord.com/invite/KKGhy8ZqzK)

+## Table Of Contents
+- [Overview](#overview)
+- [Features](#features)
+- [Installation](#installation)
+- [Quick Guide](#quick-start-guide-sdk-v2)
+  1. [Get SwanHub API Key](#1-get-swanhub-api-key)
+  2. [Log in to SwanHub](#2-log-in-to-swanhub-through-the-sdk)
+  3. [Use Swan Payment Contract](#3-connect-to-swan-payment-contract)
+  4. [Retrieve CP Hardware Info](#4-retrieve-available-hardware-information)
+  5. [Get Job Source URI](#5-get-job_source_uri)
+  6. [Estimate Task Payment](#6-estimate-payment-amount)
+  7. [Create Task](#7-create-task)
+  8. [Submit Payment](#8-submit-payment)
+  9. [Validate Payment and Deploy Task](#9-validate-payment-to-deploy-task)
+  10. [Follow Up Deployed Task Status (Optional)](#10-follow-up-task-status-optional)
+- [Documentation](#documentation)
+- [Contribution](#contributions)
+- [License](#license)
+
 ## Overview

 The SWAN SDK provides a streamlined and efficient interface for interacting with our API. It's tailored for easy creation and management of CP tasks, making it a versatile tool for a wide range of applications.

@@ -21,7 +40,7 @@ Install the SDK with ease.
 From pypi testnet:

 ```bash
-pip install -i https://test.pypi.org/simple/ orchestrator-sdk
+pip install swan-sdk
 ```

 Install from Github:

 ```bash
 git clone https://github.com/swanchain/orchestrator-sdk.git
 git checkout dev
 ```

-## Quick Start Guide SDK V1
+## Quick Start Guide SDK V2

 Jump into using the SDK with this quick example:

-```python
+### 1. Get SwanHub API Key
+
+To use `swan-sdk`, a SwanHub API key is required.
+- Go to Swan Dashboard: https://orchestrator.swanchain.io/provider-status
+- Login through MetaMask.
+- Click the user icon on the top right.
+- Click 'Show API-Key' -> 'New API Key'.
+- Store your API key safely and do not share it with others.
+
+### 2. Log in to SwanHub Through the SDK
+
+To use `swan-sdk`, you need to log in to SwanHub with your API key. (Wallet login is not supported.)
+
+```python
 from swan import SwanAPI
-# Initialize the Swan Service
-# Perform verification and retrieve signed contract address store in SWANAPI.contract_info
-swan_api = SwanAPI(api_key='')
+swan_api = SwanAPI(api_key="")
+```
+
+### 3. Connect to Swan Payment Contract
+
+Payments for SwanHub deployments are made through the Swan Payment Contract. To work with the contract ABIs, first create a `SwanContract()` instance:
+```python
+from swan.contract.swan_contract import SwanContract
+
+contract = SwanContract('', swan_api.contract_info)
+```

-# Retrieve List of Hardwares
+### 4. Retrieve Available Hardware Information
+
+SwanHub provides a selection of Computing Providers with different hardware configurations.
+Use `SwanAPI().get_hardware_config()` to retrieve all hardware available on SwanHub.
+
+Each hardware configuration is stored in a `HardwareConfig()` object.
+```python
+from swan.object import HardwareConfig
+```
+
+A hardware config contains a unique hardware ID, hardware name, description, hardware type (CPU/GPU), price per hour, available regions and current status.
+
+See all available hardware as Python dictionaries:
+```python
 hardwares = swan_api.get_hardware_config()
-price_list = [(hardware.name, hardware.price) for hardware in hardwares]
+hardwares_info = [hardware.to_dict() for hardware in hardwares if hardware.status == "available"]
+hardwares_info
+```
+`HardwareConfig().status` shows the availability of the hardware.
+`HardwareConfig().region` is a list of all regions this hardware is available in.

-# Deploy task
-# tx_hash from lock_payment from swan payment cotract (see demo code below)
-result = swan_api.deploy_task(cfg_name='', region='', start_in=123, duration=123, job_source_uri='', paid=123, tx_hash='', wallet_address='')
-print(result)
+Retrieve the hardware with hardware id 0:
+```python
+hardwares = swan_api.get_hardware_config()
+chosen_hardware = [hardware for hardware in hardwares if hardware.id == 0][0]
+chosen_hardware.to_dict()
+```

-# Check task info
-swan_api.get_deployment_info(task_uuid='')
+Sample output:
 ```
-Lock Swan Token onchain:
+{'id': 0,
+ 'name': 'C1ae.small',
+ 'description': 'CPU only · 2 vCPU · 2 GiB',
+ 'type': 'CPU',
+ 'reigion': ['North Carolina-US', ...],
+ 'price': '0.0',
+ 'status': 'available'
+}
+```
+
+### 5. Get job_source_uri
+
+`job_source_uri` can be created through the `SwanAPI().get_source_uri()` API.
+
+Generate a source URI using a demo Tetris docker image on GitHub as `repo_uri`: 'https://github.com/alphaflows/tetris-docker-image.git'
+```python
+job_source_uri = swan_api.get_source_uri(
+    repo_uri='',
+    hardware_id=chosen_hardware.id,
+    wallet_address=''
+)
+
+job_source_uri = job_source_uri['data']['job_source_uri']
+```
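Since `get_source_uri()` returns a response dictionary rather than the URI itself, it can help to check the response shape before indexing into it. The snippet below is a minimal, hypothetical sketch of that check, assuming the `{'data': {'job_source_uri': ...}}` structure shown above; the demo repository URL and the empty wallet placeholder are illustrative only.

```python
# Minimal sketch: defensively extract job_source_uri from the response.
# Assumes the {'data': {'job_source_uri': ...}} shape shown above; adjust
# the keys if your SDK version returns a different structure.
response = swan_api.get_source_uri(
    repo_uri='https://github.com/alphaflows/tetris-docker-image.git',  # demo repo from above
    hardware_id=chosen_hardware.id,
    wallet_address=''  # fill in your wallet address
)

job_source_uri = (response or {}).get('data', {}).get('job_source_uri')
if not job_source_uri:
    raise RuntimeError(f"Unexpected get_source_uri response: {response}")
```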
+### 6. Estimate Payment Amount
+
+To estimate the payment required for the deployment, use `SwanContract().estimate_payment()`:
+```python
+duration_hour = 1 # or the duration you want the deployment to run
+amount = contract.estimate_payment(chosen_hardware.id, duration_hour)
+amount # amount is in wei, 18 decimals
+```
+
+### 7. Create Task
+Before paying for the task, first create a task on SwanHub with the desired task attributes.
 ```python
-swan_contract = SwanContract(private_key='', contract_info=swan_api.contract_info)
+import json
+
+duration = 3600*duration_hour
+cfg_name = chosen_hardware.name
+
+result = swan_api.create_task(
+    cfg_name=cfg_name,
+    region='',
+    start_in=300, # in seconds
+    duration=duration,
+    job_source_uri=job_source_uri, # repo.source_uri
+    paid=contract._wei_to_swan(amount), # from wei to swan, amount/1e18
+    wallet_address='',
+)
+task_uuid = result['data']['task']['uuid']
+
+print(json.dumps(result, indent=2)) # Print response
+```

-# Test esimate lock revenue
-estimation = swan_contract.estimate_payment(hardware_id=1, duration=10)
-print(estimation*1e-18)
+Sample output:
+```
+{
+  "data": {
+    "task": {
+      "created_at": "1714254304",
+      "end_at": "1714257898",
+      "leading_job_id": null,
+      "refund_amount": null,
+      "status": "initialized",
+      "task_detail_cid": "https://data.mcs.lagrangedao.org/ipfs/QmXLSaBqtoWZWAUoiYxM3EDxh14kkhpUiYkVjZSK3BhfKj",
+      "tx_hash": null,
+      "updated_at": "1714254304",
+      "uuid": "f4799212-4dc2-4c0b-9209-c0ac7bc48442"
+    }
+  },
+  "message": "Task_uuid initialized.",
+  "status": "success"
+}
+```

-# Test get hardware info
-hardware_info = swan_contract.hardware_info(1)
-hardware_info
+The `task['uuid']` will be used in the following operations.

-# Test get swan token balance
-balance = swan_contract._get_swan_balance()
-print(balance*1e-18)
+### 8. Submit Payment

-# Test get gas
-gas = swan_contract._get_swan_gas()
-print(gas*1e-18)
+Use `SwanContract().submit_payment()` to pay for the task. The returned tx hash is the receipt for the payment.
+```python
+tx_hash = contract.submit_payment(task_uuid, chosen_hardware.id, duration)
+```

-# Approve Swan Token
-tx_hash = swan_contract._approve_swan_token(amount=100)
-print(tx_hash)
+### 9. Validate Payment to Deploy Task

-# Lock payment
-r = swan_contract.lock_revenue(task_id='1', hardware_id=1, duration=0)
+Use `SwanAPI().validate_payment()` to validate the payment using the tx hash and deploy the task.
+```python
+swan_api.validate_payment(
+    tx_hash=tx_hash,
+    task_uuid=task_uuid
+)
+```
+
+### 10. Follow up Task Status (Optional)
+
+#### Show results
+
+Get the deployment URL to test your task deployment using `SwanAPI().get_real_url()`.
+```python
+r = swan_api.get_real_url(task_uuid)
+print(r)
 ```

 For more detailed examples, visit the docs/ directory.

diff --git a/docs/usage.md b/docs/usage.md
index 31f0a68e..5b99a679 100644
--- a/docs/usage.md
+++ b/docs/usage.md
@@ -1,241 +1,3 @@
 # Using Swan SDK V1 APIs
-## Table Of Contents
-1. [Swan Orchestrator APIs](#using-swan-orchestrator-apis) - [Login](#login-to-swan-orchestrator-with-api-key) - [Hardware/CP Info](#retrieving-cp-machine-hardware-info)
-2. [MCS APIs](#using-mcs-apis) - [Login](#login-to-mcs-with-api-key) - [MultichainStorage Bucket](#create-mcs-bucket)
-3. [Generate Source URI](#generate-task-source-uri) - [From Github Repo](#use-github-repository) - [From Lagrange Space](#use-lagrange-space) - [From Local Directory (MCS)](#create-new-scource-uri-from-local-directory) - [From Existing MCS File](#create-source-uri-with-exisiting-mcs-directory)
-4.
[Delpoy CP Task Through Orchestrator](#deploying-task-through-orchestrator) - -## Using Swan Orchestrator APIs - -Swan Orchestrator APIs allow user to check machine configuration and deploy tasks. - -### Login to Swan Orchestrator with API Key - -Swan SDK can only login to Orchestrator through official API key. -Generate API key using your wallet on Swan Orchestrator website. - -```python -from swan import SwanAPI - -api_key = -swan_api = SwanAPI(api_key) -``` - -### Retrieving CP Machine Hardware Info - -Hardware information can be retrieved from Swan Orchestrator API. -Task can only be deployed in any region when the choosen hardware is avaliable in that region. -HardwareConfig().region contains a list of avalibale region for choosen hardware. - -```python -# Returns a list of HardwareConfig() objects -hardwares = swan_api.get_hardware_config() -# To get all hardware name and price -price_list = [(hardware.name, hardware.price) for hardware in hardwares] -``` - -HardwareConfig() object contains: -```python -{ - "id": , - "name": , - "description": , - "type": , # CPU/GPU - "reigion": , - "price": , - "status": -} -``` - -To retrieve specific hardware infomation - -```python -hardware_attribute = HardwareConfig(). -``` - -Dictionary object or JSON object of hardware: -```python -HardwareConfig().to_dict() - -HardwareConfig().to_json() -``` - -## Using MCS APIs - -MCS provides file storage on IPFS server. - -### Login to MCS with API Key -MCS SDK can only login to multichain.storage through official API key. - -```python -from swan import MCSAPI - -api_key = -mcs_api = MCSAPI(api_key) -``` - -### Create MCS Bucket -Create MCS bucket for all mcs related operation. - -```python -mcs_api.create_bucket() -``` - -## Generate Task Source URI -To deploy task on Swan Orchestrator. Remote source is required. Task deployment -API requires source uri, which should contain a .json file with deployment information. - -Source URI can be generated using Swan SDK. - -### Use GitHub Repository - -#### Create Github Repo Object -```python -from swan.object.source_uri import GithubRepo - -# Connect to Github Repo -# Hardware ID can be retrieve from SwanAPI get hardware config -repo = GithubRepo("", "", "", "", hardware_id:int=1) - -# Retrieve structure of Github Repo -repo.get_github_tree() -``` - -#### Upload Source URI to MCS - -Use MCS bucket to create a retrivable source URI for CP. - -```python -# Upload Directory to MCS -response = repo.generate_source_uri(bucket_name="", obj_name="", file_path="", mcs_client=mcs_api, replace=True) - -print(repo.source_uri) -``` - -### Use Lagrange Space - -#### Create Lagrange Space Object - -```python -from swan.object.source_uri import LagrangeSpace - -# Connect to Lagrange space -# Hardware ID can be retrieve from SwanAPI get hardware config -lag = LagrangeSpace('', '', "", hardware_id:int=1) - -# Retrieve folder structure and file details -lag.get_space_info() -``` - -#### Upload Source URI to MCS - -Use MCS bucket to create a retrivable source URI for CP. - -```python -# Upload Directory to MCS -response = repo.generate_source_uri(bucket_name="", obj_name="", file_path="", mcs_client=mcs_api, replace=True) - -print(repo.source_uri) -``` - -### Create New Scource URI From Local Directory - -#### Upload Local Repository - -Create an online repository (folder) to store your project on MCS. 
- -```python -from swan.object import Repository - -repo = Repository() - -# Add local directory -repo.add_local_dir('') - -# Upload Directory to MCS -repo.upload_local_to_mcs(, , mcs_api) -``` - -#### Upload Task Source .json and Retireve Source URI - -To get source URI use MCS with current repository. Simply call generate_source_uri(). -This function will create .json file locally contains all neccessary source information -and upload to provided MCS directory. - -```python -# Upload source -response = repo.generate_source_uri(, , , mcs_api) - -# Output source URI -repo.source_uri -``` - -### Create Source URI with Exisiting MCS Directory - -Use SourceFilesInfo() to create Source URI manually with an existing MCS directory. - -```python -source = Repository() - -# Connect to MCS -source.mcs_connection(mcs_api) - -# Add MCS folder -source.update_bucket_info(, ) - -# Get source URI -response = source.generate_source_uri(, , , mcs_api) - -# Output source URI -source.source_uri -``` - -## Pay for Swan Orchestrator Task - -```python -from swan import SwanContract - -contract = SwanContract(, ) - -# Get price of hardware -# Hardware id can be retrieved from Orchestrator API shown above -price = contract.hardware_info() - -# Get an estimate of payment in wei -estimate = contract.estimate_payment(, ) -print(estimate*1e-18) - -# Approve token -tx_hash = contract._approve_swan_token() - -# Payment -tx_hash = contract.lock_revenue(, , ) -``` - -## Deploying Task Through Orchestrator - -### Deployment - -To deploy task with source URI to Swan Orchestrator. Use SwanAPI.deploy_task. - -```python -response = swan_api.deploy_task(cfg_name=, \ - region=, start_in=, duration=, \ - job_source_uri=, paid=) - -task_uuid = response['data']['task']['uuid'] -``` - -### Check Deployment Status -```python -response = swan_api.get_deployment_info(task_uuid) -``` \ No newline at end of file +## Table Of Contents \ No newline at end of file diff --git a/examples/example-demo.ipynb b/examples/example-demo.ipynb index 8bb4529e..03bc57bd 100644 --- a/examples/example-demo.ipynb +++ b/examples/example-demo.ipynb @@ -94,226 +94,27 @@ { "data": { "text/plain": [ - "[{'id': 0,\n", - " 'name': 'C1ae.small',\n", - " 'description': 'CPU only · 2 vCPU · 2 GiB',\n", - " 'type': 'CPU',\n", - " 'reigion': ['Ivano-Frankivsk Oblast-UA',\n", - " 'North Rhine-Westphalia-DE',\n", - " 'Henan-CN',\n", - " 'Kowloon City-HK',\n", - " 'Guangdong-CN',\n", - " 'Kowloon-HK',\n", - " 'Saxony-DE',\n", - " 'Central and Western District-HK',\n", - " 'North Carolina-US',\n", - " 'California-US',\n", - " 'Bashkortostan Republic-RU',\n", - " 'Jiangsu-CN',\n", - " 'North West-SG',\n", - " 'Kyiv City-UA',\n", - " 'Illinois-US',\n", - " 'Bavaria-DE',\n", - " 'North Holland-NL',\n", - " 'Quebec-CA'],\n", - " 'price': '0.0',\n", - " 'status': 'available'},\n", - " {'id': 1,\n", - " 'name': 'C1ae.medium',\n", - " 'description': 'CPU only · 4 vCPU · 4 GiB',\n", - " 'type': 'CPU',\n", - " 'reigion': ['Ivano-Frankivsk Oblast-UA',\n", - " 'North Rhine-Westphalia-DE',\n", - " 'Henan-CN',\n", - " 'Kowloon City-HK',\n", - " 'Guangdong-CN',\n", - " 'Kowloon-HK',\n", - " 'Central and Western District-HK',\n", - " 'North Carolina-US',\n", - " 'California-US',\n", - " 'Bashkortostan Republic-RU',\n", - " 'Jiangsu-CN',\n", - " 'North West-SG',\n", - " 'Kyiv City-UA',\n", - " 'Illinois-US',\n", - " 'Bavaria-DE',\n", - " 'North Holland-NL',\n", - " 'Quebec-CA'],\n", - " 'price': '1.0',\n", - " 'status': 'available'},\n", - " {'id': 4,\n", - " 'name': 'M1ae.large',\n", 
- " 'description': 'Nvidia 3060 · 8 vCPU · 8 GiB',\n", - " 'type': 'GPU',\n", - " 'reigion': ['Kyiv City-UA'],\n", - " 'price': '4.0',\n", - " 'status': 'available'},\n", - " {'id': 6,\n", - " 'name': 'M1ae.2xlarge',\n", - " 'description': 'Nvidia 2080 Ti · 4 vCPU · 8 GiB',\n", - " 'type': 'GPU',\n", - " 'reigion': ['North Carolina-US'],\n", - " 'price': '6.0',\n", - " 'status': 'available'},\n", - " {'id': 7,\n", - " 'name': 'M1ae.3xlarge',\n", - " 'description': 'Nvidia 2080 Ti · 8 vCPU · 16 GiB',\n", - " 'type': 'GPU',\n", - " 'reigion': ['North Carolina-US'],\n", - " 'price': '6.5',\n", - " 'status': 'available'},\n", - " {'id': 12,\n", - " 'name': 'G1ae.small',\n", - " 'description': 'Nvidia 3080 · 4 vCPU · 8 GiB',\n", - " 'type': 'GPU',\n", - " 'reigion': ['Kowloon City-HK',\n", - " 'Kowloon-HK',\n", - " 'North Carolina-US',\n", - " 'California-US',\n", - " 'Quebec-CA'],\n", - " 'price': '10.0',\n", - " 'status': 'available'},\n", - " {'id': 13,\n", - " 'name': 'G1ae.medium',\n", - " 'description': 'Nvidia 3080 · 8 vCPU · 16 GiB',\n", - " 'type': 'GPU',\n", - " 'reigion': ['Kowloon City-HK',\n", - " 'Kowloon-HK',\n", - " 'North Carolina-US',\n", - " 'California-US',\n", - " 'Quebec-CA'],\n", - " 'price': '11.0',\n", - " 'status': 'available'},\n", - " {'id': 20,\n", - " 'name': 'Hpc1ae.small',\n", - " 'description': 'Nvidia 3090 · 4 vCPU · 8 GiB',\n", - " 'type': 'GPU',\n", - " 'reigion': ['North Rhine-Westphalia-DE',\n", - " 'Guangdong-CN',\n", - " 'Kowloon-HK',\n", - " 'California-US',\n", - " 'North Carolina-US',\n", - " 'Quebec-CA'],\n", - " 'price': '14.0',\n", - " 'status': 'available'},\n", - " {'id': 21,\n", - " 'name': 'Hpc1ae.medium',\n", - " 'description': 'Nvidia 3090 · 8 vCPU · 16 GiB',\n", - " 'type': 'GPU',\n", - " 'reigion': ['North Rhine-Westphalia-DE',\n", - " 'Guangdong-CN',\n", - " 'Kowloon-HK',\n", - " 'California-US',\n", - " 'North Carolina-US',\n", - " 'Quebec-CA'],\n", - " 'price': '16.0',\n", - " 'status': 'available'},\n", - " {'id': 24,\n", - " 'name': 'Hpc1ae.2xlarge',\n", - " 'description': 'NVIDIA A4000 · 4 vCPU · 8 GiB',\n", - " 'type': 'AI GPU',\n", - " 'reigion': ['North Carolina-US'],\n", - " 'price': '21.0',\n", - " 'status': 'available'},\n", - " {'id': 25,\n", - " 'name': 'Hpc1ae.3xlarge',\n", - " 'description': 'NVIDIA A4000 · 8 vCPU · 16 GiB',\n", - " 'type': 'AI GPU',\n", - " 'reigion': ['North Carolina-US'],\n", - " 'price': '21.0',\n", - " 'status': 'available'},\n", - " {'id': 27,\n", - " 'name': 'T1ae.medium',\n", - " 'description': 'Nvidia 2080 Ti · 12 vCPU · 64 GiB',\n", - " 'type': 'GPU',\n", - " 'reigion': ['North Carolina-US'],\n", - " 'price': '36.0',\n", - " 'status': 'available'},\n", - " {'id': 32,\n", - " 'name': 'Hpc2ae.small',\n", - " 'description': 'Nvidia 4090 · 4 vCPU · 8 GiB',\n", - " 'type': 'AI GPU',\n", - " 'reigion': ['Bavaria-DE', 'Henan-CN', 'Bashkortostan Republic-RU'],\n", - " 'price': '22.0',\n", - " 'status': 'available'},\n", - " {'id': 33,\n", - " 'name': 'Hpc2ae.medium',\n", - " 'description': 'Nvidia 4090 · 8 vCPU · 16 GiB',\n", - " 'type': 'AI GPU',\n", - " 'reigion': ['Bavaria-DE', 'Henan-CN', 'Bashkortostan Republic-RU'],\n", - " 'price': '24.0',\n", - " 'status': 'available'},\n", - " {'id': 42,\n", - " 'name': 'T1az.2xlarge',\n", - " 'description': 'Nvidia 4090 · 8 vCPU · 64 GiB',\n", - " 'type': 'AI GPU',\n", - " 'reigion': ['Bavaria-DE', 'Henan-CN', 'Bashkortostan Republic-RU'],\n", - " 'price': '60.0',\n", - " 'status': 'available'},\n", - " {'id': 44,\n", - " 'name': 'T1az.4xlarge',\n", - " 
'description': 'Nvidia A4000 · 8 vCPU · 64 GiB',\n", - " 'type': 'AI GPU',\n", - " 'reigion': ['North Carolina-US'],\n", - " 'price': '65.0',\n", - " 'status': 'available'},\n", - " {'id': 53,\n", - " 'name': 'T2az.2xlarge',\n", - " 'description': 'Nvidia 4090 · 12 vCPU · 128 GiB',\n", - " 'type': 'AI GPU',\n", - " 'reigion': ['Bavaria-DE'],\n", - " 'price': '70.0',\n", - " 'status': 'available'},\n", - " {'id': 55,\n", - " 'name': 'T2az.4xlarge',\n", - " 'description': 'Nvidia A4000 · 12 vCPU · 128 GiB',\n", - " 'type': 'AI GPU',\n", - " 'reigion': ['North Carolina-US'],\n", - " 'price': '75.0',\n", - " 'status': 'available'},\n", - " {'id': 72,\n", - " 'name': 'R1ae.small',\n", - " 'description': 'Nvidia 2080 TI · 8 vCPU · 32 GiB',\n", - " 'type': 'GPU',\n", - " 'reigion': ['North Carolina-US'],\n", - " 'price': '12.0',\n", - " 'status': 'available'},\n", - " {'id': 73,\n", - " 'name': 'R1ae.medium',\n", - " 'description': 'Nvidia 3080 · 8 vCPU · 32 GiB',\n", - " 'type': 'GPU',\n", - " 'reigion': ['North Carolina-US',\n", - " 'California-US',\n", - " 'Kowloon City-HK',\n", - " 'Quebec-CA'],\n", - " 'price': '22.0',\n", - " 'status': 'available'},\n", - " {'id': 74,\n", - " 'name': 'R1ae.large',\n", - " 'description': 'Nvidia 3090 · 8 vCPU · 32 GiB',\n", - " 'type': 'GPU',\n", - " 'reigion': ['North Rhine-Westphalia-DE',\n", - " 'Guangdong-CN',\n", - " 'Kowloon-HK',\n", - " 'California-US',\n", - " 'North Carolina-US',\n", - " 'Quebec-CA'],\n", - " 'price': '30.0',\n", - " 'status': 'available'},\n", - " {'id': 77,\n", - " 'name': 'R2ae.large',\n", - " 'description': 'Nvidia 4090 · 8 vCPU · 32 GiB',\n", - " 'type': 'AI GPU',\n", - " 'reigion': ['Bavaria-DE', 'Henan-CN', 'Bashkortostan Republic-RU'],\n", - " 'price': '50.0',\n", - " 'status': 'available'},\n", - " {'id': 78,\n", - " 'name': 'R2ae.xlarge',\n", - " 'description': 'Nvidia A4000 · 8 vCPU · 32 GiB',\n", - " 'type': 'AI GPU',\n", - " 'reigion': ['North Carolina-US'],\n", - " 'price': '52.0',\n", - " 'status': 'available'}]" + "{'id': 0,\n", + " 'name': 'C1ae.small',\n", + " 'description': 'CPU only · 2 vCPU · 2 GiB',\n", + " 'type': 'CPU',\n", + " 'reigion': ['North Carolina-US',\n", + " 'Bashkortostan Republic-RU',\n", + " 'Kyiv City-UA',\n", + " 'Kowloon City-HK',\n", + " 'Tokyo-JP',\n", + " 'California-US',\n", + " 'Central and Western District-HK',\n", + " 'Quebec-CA',\n", + " 'North West-SG',\n", + " 'Kwai Tsing-HK',\n", + " 'Bavaria-DE',\n", + " 'Saxony-DE',\n", + " 'Guangdong-CN',\n", + " 'Kowloon-HK',\n", + " 'North Rhine-Westphalia-DE'],\n", + " 'price': '0.0',\n", + " 'status': 'available'}" ] }, "execution_count": 3, @@ -340,14 +141,14 @@ }, { "cell_type": "code", - "execution_count": 4, + "execution_count": 6, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ - "hardware.name='C1ae.medium', hardware.id=1, ['Ivano-Frankivsk Oblast-UA', 'North Rhine-Westphalia-DE', 'Henan-CN', 'Kowloon City-HK', 'Guangdong-CN', 'Kowloon-HK', 'Central and Western District-HK', 'North Carolina-US', 'California-US', 'Bashkortostan Republic-RU', 'Jiangsu-CN', 'North West-SG', 'Kyiv City-UA', 'Illinois-US', 'Bavaria-DE', 'North Holland-NL', 'Quebec-CA']\n", + "hardware.name='C1ae.medium', hardware.id=1, ['North Carolina-US', 'Bashkortostan Republic-RU', 'Kyiv City-UA', 'Kowloon City-HK', 'Tokyo-JP', 'California-US', 'Central and Western District-HK', 'Quebec-CA', 'North West-SG', 'Kwai Tsing-HK', 'Bavaria-DE', 'Guangdong-CN', 'Kowloon-HK', 'North Rhine-Westphalia-DE']\n", "The chosen hardware_id=1\n" ] } @@ 
-363,7 +164,7 @@ }, { "cell_type": "code", - "execution_count": 5, + "execution_count": 15, "metadata": {}, "outputs": [ { @@ -372,7 +173,7 @@ "1" ] }, - "execution_count": 5, + "execution_count": 15, "metadata": {}, "output_type": "execute_result" } @@ -392,13 +193,13 @@ }, { "cell_type": "code", - "execution_count": 6, + "execution_count": 7, "metadata": {}, "outputs": [], "source": [ "\n", "job_source_uri = swan_api.get_source_uri(\n", - " repo_uri='https://github.com/alphaflows/tetris-docker-image.git',\n", + " repo_uri='https://github.com/alphaflows/tetris-docker-image',\n", " hardware_id=hardware_id,\n", " wallet_address=os.getenv('WALLET')\n", ")" @@ -406,7 +207,7 @@ }, { "cell_type": "code", - "execution_count": 7, + "execution_count": 8, "metadata": {}, "outputs": [], "source": [ @@ -416,9 +217,20 @@ }, { "cell_type": "code", - "execution_count": 8, + "execution_count": 9, "metadata": {}, - "outputs": [], + "outputs": [ + { + "data": { + "text/plain": [ + "'https://data.mcs.lagrangedao.org/ipfs/QmSc5G8YdR4d6WQFJm844FqS4F8qAGWpo5YXr5auVamh9M'" + ] + }, + "execution_count": 9, + "metadata": {}, + "output_type": "execute_result" + } + ], "source": [ "job_source_uri" ] @@ -432,7 +244,7 @@ }, { "cell_type": "code", - "execution_count": 9, + "execution_count": 19, "metadata": {}, "outputs": [ { @@ -460,36 +272,30 @@ }, { "cell_type": "code", - "execution_count": 10, + "execution_count": 20, "metadata": {}, "outputs": [ { - "ename": "KeyboardInterrupt", - "evalue": "", - "output_type": "error", - "traceback": [ - "\u001b[0;31m---------------------------------------------------------------------------\u001b[0m", - "\u001b[0;31mKeyboardInterrupt\u001b[0m Traceback (most recent call last)", - "Cell \u001b[0;32mIn[10], line 5\u001b[0m\n\u001b[1;32m 1\u001b[0m \u001b[38;5;28;01mimport\u001b[39;00m \u001b[38;5;21;01mjson\u001b[39;00m\n\u001b[1;32m 3\u001b[0m duration\u001b[38;5;241m=\u001b[39m\u001b[38;5;241m3600\u001b[39m\u001b[38;5;241m*\u001b[39mduration_hour\n\u001b[0;32m----> 5\u001b[0m result \u001b[38;5;241m=\u001b[39m \u001b[43mswan_api\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mcreate_task\u001b[49m\u001b[43m(\u001b[49m\n\u001b[1;32m 6\u001b[0m \u001b[43m \u001b[49m\u001b[43mcfg_name\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[43mcfg_name\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\n\u001b[1;32m 7\u001b[0m \u001b[43m \u001b[49m\u001b[43mregion\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[38;5;124;43m'\u001b[39;49m\u001b[38;5;124;43mNorth Carolina-US\u001b[39;49m\u001b[38;5;124;43m'\u001b[39;49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\n\u001b[1;32m 8\u001b[0m \u001b[43m \u001b[49m\u001b[43mstart_in\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[38;5;241;43m300\u001b[39;49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\n\u001b[1;32m 9\u001b[0m \u001b[43m \u001b[49m\u001b[43mduration\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[43mduration\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\n\u001b[1;32m 10\u001b[0m \u001b[43m \u001b[49m\u001b[43mjob_source_uri\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[43mjob_source_uri\u001b[49m\u001b[43m,\u001b[49m\u001b[38;5;66;43;03m#repo.source_uri, \u001b[39;49;00m\n\u001b[1;32m 11\u001b[0m \u001b[43m \u001b[49m\u001b[43mpaid\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[43mcontract\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43m_wei_to_swan\u001b[49m\u001b[43m(\u001b[49m\u001b[43mamount\u001b[49m\u001b[43m)\u001b[49m\u001b[43m,\u001b[49m\n\u001b[1;32m 12\u001b[0m \u001b[43m 
\u001b[49m\u001b[43mwallet_address\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[43mos\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mgetenv\u001b[49m\u001b[43m(\u001b[49m\u001b[38;5;124;43m'\u001b[39;49m\u001b[38;5;124;43mWALLET\u001b[39;49m\u001b[38;5;124;43m'\u001b[39;49m\u001b[43m)\u001b[49m\u001b[43m,\u001b[49m\n\u001b[1;32m 13\u001b[0m \u001b[43m)\u001b[49m\n\u001b[1;32m 14\u001b[0m \u001b[38;5;28mprint\u001b[39m(json\u001b[38;5;241m.\u001b[39mdumps(result, indent\u001b[38;5;241m=\u001b[39m\u001b[38;5;241m2\u001b[39m))\n\u001b[1;32m 15\u001b[0m task_uuid \u001b[38;5;241m=\u001b[39m result[\u001b[38;5;124m'\u001b[39m\u001b[38;5;124mdata\u001b[39m\u001b[38;5;124m'\u001b[39m][\u001b[38;5;124m'\u001b[39m\u001b[38;5;124mtask\u001b[39m\u001b[38;5;124m'\u001b[39m][\u001b[38;5;124m'\u001b[39m\u001b[38;5;124muuid\u001b[39m\u001b[38;5;124m'\u001b[39m]\n", - "File \u001b[0;32m~/Documents/work_repos/orchestrator-sdk/examples/../swan/api/swan_api.py:193\u001b[0m, in \u001b[0;36mSwanAPI.create_task\u001b[0;34m(self, cfg_name, region, start_in, duration, job_source_uri, wallet_address, paid)\u001b[0m\n\u001b[1;32m 183\u001b[0m \u001b[38;5;28;01mif\u001b[39;00m \u001b[38;5;28mself\u001b[39m\u001b[38;5;241m.\u001b[39m_verify_hardware_region(cfg_name, region):\n\u001b[1;32m 184\u001b[0m params \u001b[38;5;241m=\u001b[39m {\n\u001b[1;32m 185\u001b[0m \u001b[38;5;124m\"\u001b[39m\u001b[38;5;124mpaid\u001b[39m\u001b[38;5;124m\"\u001b[39m: paid,\n\u001b[1;32m 186\u001b[0m \u001b[38;5;124m\"\u001b[39m\u001b[38;5;124mduration\u001b[39m\u001b[38;5;124m\"\u001b[39m: duration,\n\u001b[0;32m (...)\u001b[0m\n\u001b[1;32m 191\u001b[0m \u001b[38;5;124m\"\u001b[39m\u001b[38;5;124mjob_source_uri\u001b[39m\u001b[38;5;124m\"\u001b[39m: job_source_uri\n\u001b[1;32m 192\u001b[0m }\n\u001b[0;32m--> 193\u001b[0m result \u001b[38;5;241m=\u001b[39m \u001b[38;5;28;43mself\u001b[39;49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43m_request_with_params\u001b[49m\u001b[43m(\u001b[49m\n\u001b[1;32m 194\u001b[0m \u001b[43m \u001b[49m\u001b[43mPOST\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\n\u001b[1;32m 195\u001b[0m \u001b[43m \u001b[49m\u001b[43mCREATE_TASK\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\n\u001b[1;32m 196\u001b[0m \u001b[43m \u001b[49m\u001b[38;5;28;43mself\u001b[39;49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mswan_url\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\n\u001b[1;32m 197\u001b[0m \u001b[43m \u001b[49m\u001b[43mparams\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\n\u001b[1;32m 198\u001b[0m \u001b[43m \u001b[49m\u001b[38;5;28;43mself\u001b[39;49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mtoken\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\n\u001b[1;32m 199\u001b[0m \u001b[43m \u001b[49m\u001b[38;5;28;43;01mNone\u001b[39;49;00m\n\u001b[1;32m 200\u001b[0m \u001b[43m \u001b[49m\u001b[43m)\u001b[49m\n\u001b[1;32m 201\u001b[0m \u001b[38;5;28;01mreturn\u001b[39;00m result\n\u001b[1;32m 202\u001b[0m \u001b[38;5;28;01melse\u001b[39;00m:\n", - "File \u001b[0;32m~/Documents/work_repos/orchestrator-sdk/examples/../swan/api_client.py:113\u001b[0m, in \u001b[0;36mAPIClient._request_with_params\u001b[0;34m(self, method, request_path, swan_api, params, token, files, json_body)\u001b[0m\n\u001b[1;32m 112\u001b[0m \u001b[38;5;28;01mdef\u001b[39;00m \u001b[38;5;21m_request_with_params\u001b[39m(\u001b[38;5;28mself\u001b[39m, method, request_path, swan_api, params, token, files, json_body\u001b[38;5;241m=\u001b[39m\u001b[38;5;28;01mFalse\u001b[39;00m):\n\u001b[0;32m--> 113\u001b[0m 
\u001b[38;5;28;01mreturn\u001b[39;00m \u001b[38;5;28;43mself\u001b[39;49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43m_request\u001b[49m\u001b[43m(\u001b[49m\u001b[43mmethod\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43mrequest_path\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43mswan_api\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43mparams\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43mtoken\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43mfiles\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43mjson_body\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[43mjson_body\u001b[49m\u001b[43m)\u001b[49m\n", - "File \u001b[0;32m~/Documents/work_repos/orchestrator-sdk/examples/../swan/api_client.py:39\u001b[0m, in \u001b[0;36mAPIClient._request\u001b[0;34m(self, method, request_path, swan_api, params, token, files, json_body)\u001b[0m\n\u001b[1;32m 37\u001b[0m \u001b[38;5;28;01melse\u001b[39;00m:\n\u001b[1;32m 38\u001b[0m body \u001b[38;5;241m=\u001b[39m params\n\u001b[0;32m---> 39\u001b[0m response \u001b[38;5;241m=\u001b[39m \u001b[43mrequests\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mpost\u001b[49m\u001b[43m(\u001b[49m\u001b[43murl\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43mdata\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[43mbody\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43mheaders\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[43mheader\u001b[49m\u001b[43m)\u001b[49m\n\u001b[1;32m 40\u001b[0m \u001b[38;5;28;01melif\u001b[39;00m method \u001b[38;5;241m==\u001b[39m DELETE:\n\u001b[1;32m 41\u001b[0m \u001b[38;5;28;01mif\u001b[39;00m params:\n", - "File \u001b[0;32m~/Documents/work_repos/orchestrator-sdk/venv/lib/python3.10/site-packages/requests/api.py:115\u001b[0m, in \u001b[0;36mpost\u001b[0;34m(url, data, json, **kwargs)\u001b[0m\n\u001b[1;32m 103\u001b[0m \u001b[38;5;28;01mdef\u001b[39;00m \u001b[38;5;21mpost\u001b[39m(url, data\u001b[38;5;241m=\u001b[39m\u001b[38;5;28;01mNone\u001b[39;00m, json\u001b[38;5;241m=\u001b[39m\u001b[38;5;28;01mNone\u001b[39;00m, \u001b[38;5;241m*\u001b[39m\u001b[38;5;241m*\u001b[39mkwargs):\n\u001b[1;32m 104\u001b[0m \u001b[38;5;250m \u001b[39m\u001b[38;5;124mr\u001b[39m\u001b[38;5;124;03m\"\"\"Sends a POST request.\u001b[39;00m\n\u001b[1;32m 105\u001b[0m \n\u001b[1;32m 106\u001b[0m \u001b[38;5;124;03m :param url: URL for the new :class:`Request` object.\u001b[39;00m\n\u001b[0;32m (...)\u001b[0m\n\u001b[1;32m 112\u001b[0m \u001b[38;5;124;03m :rtype: requests.Response\u001b[39;00m\n\u001b[1;32m 113\u001b[0m \u001b[38;5;124;03m \"\"\"\u001b[39;00m\n\u001b[0;32m--> 115\u001b[0m \u001b[38;5;28;01mreturn\u001b[39;00m \u001b[43mrequest\u001b[49m\u001b[43m(\u001b[49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[38;5;124;43mpost\u001b[39;49m\u001b[38;5;124;43m\"\u001b[39;49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43murl\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43mdata\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[43mdata\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43mjson\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[43mjson\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[38;5;241;43m*\u001b[39;49m\u001b[38;5;241;43m*\u001b[39;49m\u001b[43mkwargs\u001b[49m\u001b[43m)\u001b[49m\n", - "File \u001b[0;32m~/Documents/work_repos/orchestrator-sdk/venv/lib/python3.10/site-packages/requests/api.py:59\u001b[0m, in \u001b[0;36mrequest\u001b[0;34m(method, url, **kwargs)\u001b[0m\n\u001b[1;32m 55\u001b[0m 
\u001b[38;5;66;03m# By using the 'with' statement we are sure the session is closed, thus we\u001b[39;00m\n\u001b[1;32m 56\u001b[0m \u001b[38;5;66;03m# avoid leaving sockets open which can trigger a ResourceWarning in some\u001b[39;00m\n\u001b[1;32m 57\u001b[0m \u001b[38;5;66;03m# cases, and look like a memory leak in others.\u001b[39;00m\n\u001b[1;32m 58\u001b[0m \u001b[38;5;28;01mwith\u001b[39;00m sessions\u001b[38;5;241m.\u001b[39mSession() \u001b[38;5;28;01mas\u001b[39;00m session:\n\u001b[0;32m---> 59\u001b[0m \u001b[38;5;28;01mreturn\u001b[39;00m \u001b[43msession\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mrequest\u001b[49m\u001b[43m(\u001b[49m\u001b[43mmethod\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[43mmethod\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43murl\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[43murl\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[38;5;241;43m*\u001b[39;49m\u001b[38;5;241;43m*\u001b[39;49m\u001b[43mkwargs\u001b[49m\u001b[43m)\u001b[49m\n", - "File \u001b[0;32m~/Documents/work_repos/orchestrator-sdk/venv/lib/python3.10/site-packages/requests/sessions.py:587\u001b[0m, in \u001b[0;36mSession.request\u001b[0;34m(self, method, url, params, data, headers, cookies, files, auth, timeout, allow_redirects, proxies, hooks, stream, verify, cert, json)\u001b[0m\n\u001b[1;32m 582\u001b[0m send_kwargs \u001b[38;5;241m=\u001b[39m {\n\u001b[1;32m 583\u001b[0m \u001b[38;5;124m\"\u001b[39m\u001b[38;5;124mtimeout\u001b[39m\u001b[38;5;124m\"\u001b[39m: timeout,\n\u001b[1;32m 584\u001b[0m \u001b[38;5;124m\"\u001b[39m\u001b[38;5;124mallow_redirects\u001b[39m\u001b[38;5;124m\"\u001b[39m: allow_redirects,\n\u001b[1;32m 585\u001b[0m }\n\u001b[1;32m 586\u001b[0m send_kwargs\u001b[38;5;241m.\u001b[39mupdate(settings)\n\u001b[0;32m--> 587\u001b[0m resp \u001b[38;5;241m=\u001b[39m \u001b[38;5;28;43mself\u001b[39;49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43msend\u001b[49m\u001b[43m(\u001b[49m\u001b[43mprep\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[38;5;241;43m*\u001b[39;49m\u001b[38;5;241;43m*\u001b[39;49m\u001b[43msend_kwargs\u001b[49m\u001b[43m)\u001b[49m\n\u001b[1;32m 589\u001b[0m \u001b[38;5;28;01mreturn\u001b[39;00m resp\n", - "File \u001b[0;32m~/Documents/work_repos/orchestrator-sdk/venv/lib/python3.10/site-packages/requests/sessions.py:701\u001b[0m, in \u001b[0;36mSession.send\u001b[0;34m(self, request, **kwargs)\u001b[0m\n\u001b[1;32m 698\u001b[0m start \u001b[38;5;241m=\u001b[39m preferred_clock()\n\u001b[1;32m 700\u001b[0m \u001b[38;5;66;03m# Send the request\u001b[39;00m\n\u001b[0;32m--> 701\u001b[0m r \u001b[38;5;241m=\u001b[39m \u001b[43madapter\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43msend\u001b[49m\u001b[43m(\u001b[49m\u001b[43mrequest\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[38;5;241;43m*\u001b[39;49m\u001b[38;5;241;43m*\u001b[39;49m\u001b[43mkwargs\u001b[49m\u001b[43m)\u001b[49m\n\u001b[1;32m 703\u001b[0m \u001b[38;5;66;03m# Total elapsed time of the request (approximately)\u001b[39;00m\n\u001b[1;32m 704\u001b[0m elapsed \u001b[38;5;241m=\u001b[39m preferred_clock() \u001b[38;5;241m-\u001b[39m start\n", - "File \u001b[0;32m~/Documents/work_repos/orchestrator-sdk/venv/lib/python3.10/site-packages/requests/adapters.py:489\u001b[0m, in \u001b[0;36mHTTPAdapter.send\u001b[0;34m(self, request, stream, timeout, verify, cert, proxies)\u001b[0m\n\u001b[1;32m 487\u001b[0m \u001b[38;5;28;01mtry\u001b[39;00m:\n\u001b[1;32m 488\u001b[0m \u001b[38;5;28;01mif\u001b[39;00m 
\u001b[38;5;129;01mnot\u001b[39;00m chunked:\n\u001b[0;32m--> 489\u001b[0m resp \u001b[38;5;241m=\u001b[39m \u001b[43mconn\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43murlopen\u001b[49m\u001b[43m(\u001b[49m\n\u001b[1;32m 490\u001b[0m \u001b[43m \u001b[49m\u001b[43mmethod\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[43mrequest\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mmethod\u001b[49m\u001b[43m,\u001b[49m\n\u001b[1;32m 491\u001b[0m \u001b[43m \u001b[49m\u001b[43murl\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[43murl\u001b[49m\u001b[43m,\u001b[49m\n\u001b[1;32m 492\u001b[0m \u001b[43m \u001b[49m\u001b[43mbody\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[43mrequest\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mbody\u001b[49m\u001b[43m,\u001b[49m\n\u001b[1;32m 493\u001b[0m \u001b[43m \u001b[49m\u001b[43mheaders\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[43mrequest\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mheaders\u001b[49m\u001b[43m,\u001b[49m\n\u001b[1;32m 494\u001b[0m \u001b[43m \u001b[49m\u001b[43mredirect\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[38;5;28;43;01mFalse\u001b[39;49;00m\u001b[43m,\u001b[49m\n\u001b[1;32m 495\u001b[0m \u001b[43m \u001b[49m\u001b[43massert_same_host\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[38;5;28;43;01mFalse\u001b[39;49;00m\u001b[43m,\u001b[49m\n\u001b[1;32m 496\u001b[0m \u001b[43m \u001b[49m\u001b[43mpreload_content\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[38;5;28;43;01mFalse\u001b[39;49;00m\u001b[43m,\u001b[49m\n\u001b[1;32m 497\u001b[0m \u001b[43m \u001b[49m\u001b[43mdecode_content\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[38;5;28;43;01mFalse\u001b[39;49;00m\u001b[43m,\u001b[49m\n\u001b[1;32m 498\u001b[0m \u001b[43m \u001b[49m\u001b[43mretries\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[38;5;28;43mself\u001b[39;49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mmax_retries\u001b[49m\u001b[43m,\u001b[49m\n\u001b[1;32m 499\u001b[0m \u001b[43m \u001b[49m\u001b[43mtimeout\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[43mtimeout\u001b[49m\u001b[43m,\u001b[49m\n\u001b[1;32m 500\u001b[0m \u001b[43m \u001b[49m\u001b[43m)\u001b[49m\n\u001b[1;32m 502\u001b[0m \u001b[38;5;66;03m# Send the request.\u001b[39;00m\n\u001b[1;32m 503\u001b[0m \u001b[38;5;28;01melse\u001b[39;00m:\n\u001b[1;32m 504\u001b[0m \u001b[38;5;28;01mif\u001b[39;00m \u001b[38;5;28mhasattr\u001b[39m(conn, \u001b[38;5;124m\"\u001b[39m\u001b[38;5;124mproxy_pool\u001b[39m\u001b[38;5;124m\"\u001b[39m):\n", - "File \u001b[0;32m~/Documents/work_repos/orchestrator-sdk/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:715\u001b[0m, in \u001b[0;36mHTTPConnectionPool.urlopen\u001b[0;34m(self, method, url, body, headers, retries, redirect, assert_same_host, timeout, pool_timeout, release_conn, chunked, body_pos, **response_kw)\u001b[0m\n\u001b[1;32m 712\u001b[0m \u001b[38;5;28mself\u001b[39m\u001b[38;5;241m.\u001b[39m_prepare_proxy(conn)\n\u001b[1;32m 714\u001b[0m \u001b[38;5;66;03m# Make the request on the httplib connection object.\u001b[39;00m\n\u001b[0;32m--> 715\u001b[0m httplib_response \u001b[38;5;241m=\u001b[39m \u001b[38;5;28;43mself\u001b[39;49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43m_make_request\u001b[49m\u001b[43m(\u001b[49m\n\u001b[1;32m 716\u001b[0m \u001b[43m \u001b[49m\u001b[43mconn\u001b[49m\u001b[43m,\u001b[49m\n\u001b[1;32m 717\u001b[0m \u001b[43m \u001b[49m\u001b[43mmethod\u001b[49m\u001b[43m,\u001b[49m\n\u001b[1;32m 718\u001b[0m \u001b[43m 
\u001b[49m\u001b[43murl\u001b[49m\u001b[43m,\u001b[49m\n\u001b[1;32m 719\u001b[0m \u001b[43m \u001b[49m\u001b[43mtimeout\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[43mtimeout_obj\u001b[49m\u001b[43m,\u001b[49m\n\u001b[1;32m 720\u001b[0m \u001b[43m \u001b[49m\u001b[43mbody\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[43mbody\u001b[49m\u001b[43m,\u001b[49m\n\u001b[1;32m 721\u001b[0m \u001b[43m \u001b[49m\u001b[43mheaders\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[43mheaders\u001b[49m\u001b[43m,\u001b[49m\n\u001b[1;32m 722\u001b[0m \u001b[43m \u001b[49m\u001b[43mchunked\u001b[49m\u001b[38;5;241;43m=\u001b[39;49m\u001b[43mchunked\u001b[49m\u001b[43m,\u001b[49m\n\u001b[1;32m 723\u001b[0m \u001b[43m\u001b[49m\u001b[43m)\u001b[49m\n\u001b[1;32m 725\u001b[0m \u001b[38;5;66;03m# If we're going to release the connection in ``finally:``, then\u001b[39;00m\n\u001b[1;32m 726\u001b[0m \u001b[38;5;66;03m# the response doesn't need to know about the connection. Otherwise\u001b[39;00m\n\u001b[1;32m 727\u001b[0m \u001b[38;5;66;03m# it will also try to release it and we'll have a double-release\u001b[39;00m\n\u001b[1;32m 728\u001b[0m \u001b[38;5;66;03m# mess.\u001b[39;00m\n\u001b[1;32m 729\u001b[0m response_conn \u001b[38;5;241m=\u001b[39m conn \u001b[38;5;28;01mif\u001b[39;00m \u001b[38;5;129;01mnot\u001b[39;00m release_conn \u001b[38;5;28;01melse\u001b[39;00m \u001b[38;5;28;01mNone\u001b[39;00m\n", - "File \u001b[0;32m~/Documents/work_repos/orchestrator-sdk/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:467\u001b[0m, in \u001b[0;36mHTTPConnectionPool._make_request\u001b[0;34m(self, conn, method, url, timeout, chunked, **httplib_request_kw)\u001b[0m\n\u001b[1;32m 462\u001b[0m httplib_response \u001b[38;5;241m=\u001b[39m conn\u001b[38;5;241m.\u001b[39mgetresponse()\n\u001b[1;32m 463\u001b[0m \u001b[38;5;28;01mexcept\u001b[39;00m \u001b[38;5;167;01mBaseException\u001b[39;00m \u001b[38;5;28;01mas\u001b[39;00m e:\n\u001b[1;32m 464\u001b[0m \u001b[38;5;66;03m# Remove the TypeError from the exception chain in\u001b[39;00m\n\u001b[1;32m 465\u001b[0m \u001b[38;5;66;03m# Python 3 (including for exceptions like SystemExit).\u001b[39;00m\n\u001b[1;32m 466\u001b[0m \u001b[38;5;66;03m# Otherwise it looks like a bug in the code.\u001b[39;00m\n\u001b[0;32m--> 467\u001b[0m \u001b[43msix\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mraise_from\u001b[49m\u001b[43m(\u001b[49m\u001b[43me\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[38;5;28;43;01mNone\u001b[39;49;00m\u001b[43m)\u001b[49m\n\u001b[1;32m 468\u001b[0m \u001b[38;5;28;01mexcept\u001b[39;00m (SocketTimeout, BaseSSLError, SocketError) \u001b[38;5;28;01mas\u001b[39;00m e:\n\u001b[1;32m 469\u001b[0m \u001b[38;5;28mself\u001b[39m\u001b[38;5;241m.\u001b[39m_raise_timeout(err\u001b[38;5;241m=\u001b[39me, url\u001b[38;5;241m=\u001b[39murl, timeout_value\u001b[38;5;241m=\u001b[39mread_timeout)\n", - "File \u001b[0;32m:3\u001b[0m, in \u001b[0;36mraise_from\u001b[0;34m(value, from_value)\u001b[0m\n", - "File \u001b[0;32m~/Documents/work_repos/orchestrator-sdk/venv/lib/python3.10/site-packages/urllib3/connectionpool.py:462\u001b[0m, in \u001b[0;36mHTTPConnectionPool._make_request\u001b[0;34m(self, conn, method, url, timeout, chunked, **httplib_request_kw)\u001b[0m\n\u001b[1;32m 459\u001b[0m \u001b[38;5;28;01mexcept\u001b[39;00m \u001b[38;5;167;01mTypeError\u001b[39;00m:\n\u001b[1;32m 460\u001b[0m \u001b[38;5;66;03m# Python 3\u001b[39;00m\n\u001b[1;32m 461\u001b[0m \u001b[38;5;28;01mtry\u001b[39;00m:\n\u001b[0;32m--> 462\u001b[0m 
httplib_response \u001b[38;5;241m=\u001b[39m \u001b[43mconn\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mgetresponse\u001b[49m\u001b[43m(\u001b[49m\u001b[43m)\u001b[49m\n\u001b[1;32m 463\u001b[0m \u001b[38;5;28;01mexcept\u001b[39;00m \u001b[38;5;167;01mBaseException\u001b[39;00m \u001b[38;5;28;01mas\u001b[39;00m e:\n\u001b[1;32m 464\u001b[0m \u001b[38;5;66;03m# Remove the TypeError from the exception chain in\u001b[39;00m\n\u001b[1;32m 465\u001b[0m \u001b[38;5;66;03m# Python 3 (including for exceptions like SystemExit).\u001b[39;00m\n\u001b[1;32m 466\u001b[0m \u001b[38;5;66;03m# Otherwise it looks like a bug in the code.\u001b[39;00m\n\u001b[1;32m 467\u001b[0m six\u001b[38;5;241m.\u001b[39mraise_from(e, \u001b[38;5;28;01mNone\u001b[39;00m)\n", - "File \u001b[0;32m/opt/homebrew/Cellar/python@3.10/3.10.14/Frameworks/Python.framework/Versions/3.10/lib/python3.10/http/client.py:1375\u001b[0m, in \u001b[0;36mHTTPConnection.getresponse\u001b[0;34m(self)\u001b[0m\n\u001b[1;32m 1373\u001b[0m \u001b[38;5;28;01mtry\u001b[39;00m:\n\u001b[1;32m 1374\u001b[0m \u001b[38;5;28;01mtry\u001b[39;00m:\n\u001b[0;32m-> 1375\u001b[0m \u001b[43mresponse\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mbegin\u001b[49m\u001b[43m(\u001b[49m\u001b[43m)\u001b[49m\n\u001b[1;32m 1376\u001b[0m \u001b[38;5;28;01mexcept\u001b[39;00m \u001b[38;5;167;01mConnectionError\u001b[39;00m:\n\u001b[1;32m 1377\u001b[0m \u001b[38;5;28mself\u001b[39m\u001b[38;5;241m.\u001b[39mclose()\n", - "File \u001b[0;32m/opt/homebrew/Cellar/python@3.10/3.10.14/Frameworks/Python.framework/Versions/3.10/lib/python3.10/http/client.py:318\u001b[0m, in \u001b[0;36mHTTPResponse.begin\u001b[0;34m(self)\u001b[0m\n\u001b[1;32m 316\u001b[0m \u001b[38;5;66;03m# read until we get a non-100 response\u001b[39;00m\n\u001b[1;32m 317\u001b[0m \u001b[38;5;28;01mwhile\u001b[39;00m \u001b[38;5;28;01mTrue\u001b[39;00m:\n\u001b[0;32m--> 318\u001b[0m version, status, reason \u001b[38;5;241m=\u001b[39m \u001b[38;5;28;43mself\u001b[39;49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43m_read_status\u001b[49m\u001b[43m(\u001b[49m\u001b[43m)\u001b[49m\n\u001b[1;32m 319\u001b[0m \u001b[38;5;28;01mif\u001b[39;00m status \u001b[38;5;241m!=\u001b[39m CONTINUE:\n\u001b[1;32m 320\u001b[0m \u001b[38;5;28;01mbreak\u001b[39;00m\n", - "File \u001b[0;32m/opt/homebrew/Cellar/python@3.10/3.10.14/Frameworks/Python.framework/Versions/3.10/lib/python3.10/http/client.py:279\u001b[0m, in \u001b[0;36mHTTPResponse._read_status\u001b[0;34m(self)\u001b[0m\n\u001b[1;32m 278\u001b[0m \u001b[38;5;28;01mdef\u001b[39;00m \u001b[38;5;21m_read_status\u001b[39m(\u001b[38;5;28mself\u001b[39m):\n\u001b[0;32m--> 279\u001b[0m line \u001b[38;5;241m=\u001b[39m \u001b[38;5;28mstr\u001b[39m(\u001b[38;5;28;43mself\u001b[39;49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mfp\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mreadline\u001b[49m\u001b[43m(\u001b[49m\u001b[43m_MAXLINE\u001b[49m\u001b[43m \u001b[49m\u001b[38;5;241;43m+\u001b[39;49m\u001b[43m \u001b[49m\u001b[38;5;241;43m1\u001b[39;49m\u001b[43m)\u001b[49m, \u001b[38;5;124m\"\u001b[39m\u001b[38;5;124miso-8859-1\u001b[39m\u001b[38;5;124m\"\u001b[39m)\n\u001b[1;32m 280\u001b[0m \u001b[38;5;28;01mif\u001b[39;00m \u001b[38;5;28mlen\u001b[39m(line) \u001b[38;5;241m>\u001b[39m _MAXLINE:\n\u001b[1;32m 281\u001b[0m \u001b[38;5;28;01mraise\u001b[39;00m LineTooLong(\u001b[38;5;124m\"\u001b[39m\u001b[38;5;124mstatus line\u001b[39m\u001b[38;5;124m\"\u001b[39m)\n", - "File 
\u001b[0;32m/opt/homebrew/Cellar/python@3.10/3.10.14/Frameworks/Python.framework/Versions/3.10/lib/python3.10/socket.py:705\u001b[0m, in \u001b[0;36mSocketIO.readinto\u001b[0;34m(self, b)\u001b[0m\n\u001b[1;32m 703\u001b[0m \u001b[38;5;28;01mwhile\u001b[39;00m \u001b[38;5;28;01mTrue\u001b[39;00m:\n\u001b[1;32m 704\u001b[0m \u001b[38;5;28;01mtry\u001b[39;00m:\n\u001b[0;32m--> 705\u001b[0m \u001b[38;5;28;01mreturn\u001b[39;00m \u001b[38;5;28;43mself\u001b[39;49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43m_sock\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mrecv_into\u001b[49m\u001b[43m(\u001b[49m\u001b[43mb\u001b[49m\u001b[43m)\u001b[49m\n\u001b[1;32m 706\u001b[0m \u001b[38;5;28;01mexcept\u001b[39;00m timeout:\n\u001b[1;32m 707\u001b[0m \u001b[38;5;28mself\u001b[39m\u001b[38;5;241m.\u001b[39m_timeout_occurred \u001b[38;5;241m=\u001b[39m \u001b[38;5;28;01mTrue\u001b[39;00m\n", - "File \u001b[0;32m/opt/homebrew/Cellar/python@3.10/3.10.14/Frameworks/Python.framework/Versions/3.10/lib/python3.10/ssl.py:1307\u001b[0m, in \u001b[0;36mSSLSocket.recv_into\u001b[0;34m(self, buffer, nbytes, flags)\u001b[0m\n\u001b[1;32m 1303\u001b[0m \u001b[38;5;28;01mif\u001b[39;00m flags \u001b[38;5;241m!=\u001b[39m \u001b[38;5;241m0\u001b[39m:\n\u001b[1;32m 1304\u001b[0m \u001b[38;5;28;01mraise\u001b[39;00m \u001b[38;5;167;01mValueError\u001b[39;00m(\n\u001b[1;32m 1305\u001b[0m \u001b[38;5;124m\"\u001b[39m\u001b[38;5;124mnon-zero flags not allowed in calls to recv_into() on \u001b[39m\u001b[38;5;132;01m%s\u001b[39;00m\u001b[38;5;124m\"\u001b[39m \u001b[38;5;241m%\u001b[39m\n\u001b[1;32m 1306\u001b[0m \u001b[38;5;28mself\u001b[39m\u001b[38;5;241m.\u001b[39m\u001b[38;5;18m__class__\u001b[39m)\n\u001b[0;32m-> 1307\u001b[0m \u001b[38;5;28;01mreturn\u001b[39;00m \u001b[38;5;28;43mself\u001b[39;49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mread\u001b[49m\u001b[43m(\u001b[49m\u001b[43mnbytes\u001b[49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43mbuffer\u001b[49m\u001b[43m)\u001b[49m\n\u001b[1;32m 1308\u001b[0m \u001b[38;5;28;01melse\u001b[39;00m:\n\u001b[1;32m 1309\u001b[0m \u001b[38;5;28;01mreturn\u001b[39;00m \u001b[38;5;28msuper\u001b[39m()\u001b[38;5;241m.\u001b[39mrecv_into(buffer, nbytes, flags)\n", - "File \u001b[0;32m/opt/homebrew/Cellar/python@3.10/3.10.14/Frameworks/Python.framework/Versions/3.10/lib/python3.10/ssl.py:1163\u001b[0m, in \u001b[0;36mSSLSocket.read\u001b[0;34m(self, len, buffer)\u001b[0m\n\u001b[1;32m 1161\u001b[0m \u001b[38;5;28;01mtry\u001b[39;00m:\n\u001b[1;32m 1162\u001b[0m \u001b[38;5;28;01mif\u001b[39;00m buffer \u001b[38;5;129;01mis\u001b[39;00m \u001b[38;5;129;01mnot\u001b[39;00m \u001b[38;5;28;01mNone\u001b[39;00m:\n\u001b[0;32m-> 1163\u001b[0m \u001b[38;5;28;01mreturn\u001b[39;00m \u001b[38;5;28;43mself\u001b[39;49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43m_sslobj\u001b[49m\u001b[38;5;241;43m.\u001b[39;49m\u001b[43mread\u001b[49m\u001b[43m(\u001b[49m\u001b[38;5;28;43mlen\u001b[39;49m\u001b[43m,\u001b[49m\u001b[43m \u001b[49m\u001b[43mbuffer\u001b[49m\u001b[43m)\u001b[49m\n\u001b[1;32m 1164\u001b[0m \u001b[38;5;28;01melse\u001b[39;00m:\n\u001b[1;32m 1165\u001b[0m \u001b[38;5;28;01mreturn\u001b[39;00m \u001b[38;5;28mself\u001b[39m\u001b[38;5;241m.\u001b[39m_sslobj\u001b[38;5;241m.\u001b[39mread(\u001b[38;5;28mlen\u001b[39m)\n", - "\u001b[0;31mKeyboardInterrupt\u001b[0m: " + "name": "stdout", + "output_type": "stream", + "text": [ + "{\n", + " \"data\": {\n", + " \"task\": {\n", + " \"created_at\": \"1714254304\",\n", + " \"end_at\": \"1714257898\",\n", + " \"leading_job_id\": 
null,\n", + " \"refund_amount\": null,\n", + " \"status\": \"initialized\",\n", + " \"task_detail_cid\": \"https://data.mcs.lagrangedao.org/ipfs/QmXLSaBqtoWZWAUoiYxM3EDxh14kkhpUiYkVjZSK3BhfKj\",\n", + " \"tx_hash\": null,\n", + " \"updated_at\": \"1714254304\",\n", + " \"uuid\": \"f4799212-4dc2-4c0b-9209-c0ac7bc48442\"\n", + " }\n", + " },\n", + " \"message\": \"Task_uuid initialized.\",\n", + " \"status\": \"success\"\n", + "}\n" ] } ], @@ -522,14 +328,14 @@ }, { "cell_type": "code", - "execution_count": null, + "execution_count": 21, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ - "0xca3e3a2cf9eaf8718516d0fae9e45cd693e9fe3ac31955433a55ae77bd86fad7\n" + "0x6aa4b358a4febbc00f4cc44d24ff3a02244f888aa1e569913f8f5e9d541cbb2d\n" ] } ], @@ -549,20 +355,30 @@ }, { "cell_type": "code", - "execution_count": null, + "execution_count": 22, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ - "{'tx_hash': '0xca3e3a2cf9eaf8718516d0fae9e45cd693e9fe3ac31955433a55ae77bd86fad7', 'task_uuid': 'c5da85dd-7c45-4725-ab23-68948698aa7d'}\n", + "{'tx_hash': '0x6aa4b358a4febbc00f4cc44d24ff3a02244f888aa1e569913f8f5e9d541cbb2d', 'task_uuid': 'f4799212-4dc2-4c0b-9209-c0ac7bc48442'}\n", "{\n", " \"data\": {\n", - " \"error_code\": 1131\n", + " \"task\": {\n", + " \"created_at\": \"1714254304\",\n", + " \"end_at\": \"1714257898\",\n", + " \"leading_job_id\": null,\n", + " \"refund_amount\": null,\n", + " \"status\": \"created\",\n", + " \"task_detail_cid\": \"https://data.mcs.lagrangedao.org/ipfs/QmXLSaBqtoWZWAUoiYxM3EDxh14kkhpUiYkVjZSK3BhfKj\",\n", + " \"tx_hash\": null,\n", + " \"updated_at\": \"1714254312\",\n", + " \"uuid\": \"f4799212-4dc2-4c0b-9209-c0ac7bc48442\"\n", + " }\n", " },\n", - " \"message\": \"payment validation failed: payment receipt contract address is not correct\",\n", - " \"status\": \"failed\"\n", + " \"message\": \"Task payment validated successfully.\",\n", + " \"status\": \"success\"\n", "}\n" ] } @@ -587,7 +403,7 @@ }, { "cell_type": "code", - "execution_count": null, + "execution_count": 23, "metadata": {}, "outputs": [ { @@ -599,18 +415,18 @@ " \"computing_providers\": [],\n", " \"jobs\": [],\n", " \"task\": {\n", - " \"created_at\": \"1714249086\",\n", - " \"end_at\": \"1714252680\",\n", + " \"created_at\": \"1714254304\",\n", + " \"end_at\": \"1714257898\",\n", " \"leading_job_id\": null,\n", " \"refund_amount\": null,\n", " \"status\": \"created\",\n", - " \"task_detail_cid\": \"https://plutotest.acl.swanipfs.com/ipfs/QmRhBxuNFxMmWreEQfsB1WKXjpoK9SP5yKAK4DiqE4ihKx\",\n", + " \"task_detail_cid\": \"https://data.mcs.lagrangedao.org/ipfs/QmXLSaBqtoWZWAUoiYxM3EDxh14kkhpUiYkVjZSK3BhfKj\",\n", " \"tx_hash\": null,\n", - " \"updated_at\": \"1714249094\",\n", - " \"uuid\": \"ec5d4242-0ebe-48b4-a4a1-96f5080ba064\"\n", + " \"updated_at\": \"1714254312\",\n", + " \"uuid\": \"f4799212-4dc2-4c0b-9209-c0ac7bc48442\"\n", " }\n", " },\n", - " \"message\": \"fetch task info for task_uuid='ec5d4242-0ebe-48b4-a4a1-96f5080ba064' successfully\",\n", + " \"message\": \"fetch task info for task_uuid='f4799212-4dc2-4c0b-9209-c0ac7bc48442' successfully\",\n", " \"status\": \"success\"\n", "}\n" ] @@ -634,14 +450,14 @@ }, { "cell_type": "code", - "execution_count": null, + "execution_count": 28, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ - "['https://077dupt8wa.cp.filezoo.com.cn']\n" + "['https://1aql2r340t.computing.nebulablock.com', 'https://e958tag7ox.lag.nebulablock.com', 
'https://hygltuf3x5.computing.storefrontiers.cn']\n" ] } ], @@ -652,7 +468,7 @@ }, { "cell_type": "code", - "execution_count": null, + "execution_count": 25, "metadata": {}, "outputs": [ { @@ -662,7 +478,7 @@ "traceback": [ "\u001b[0;31m---------------------------------------------------------------------------\u001b[0m", "\u001b[0;31mIndexError\u001b[0m Traceback (most recent call last)", - "Cell \u001b[0;32mIn[52], line 8\u001b[0m\n\u001b[1;32m 2\u001b[0m \u001b[38;5;28;01mimport\u001b[39;00m \u001b[38;5;21;01mjson\u001b[39;00m\n\u001b[1;32m 4\u001b[0m headers \u001b[38;5;241m=\u001b[39m {\n\u001b[1;32m 5\u001b[0m \u001b[38;5;124m'\u001b[39m\u001b[38;5;124mContent-Type\u001b[39m\u001b[38;5;124m'\u001b[39m: \u001b[38;5;124m'\u001b[39m\u001b[38;5;124mapplication/json\u001b[39m\u001b[38;5;124m'\u001b[39m,\n\u001b[1;32m 6\u001b[0m }\n\u001b[0;32m----> 8\u001b[0m response \u001b[38;5;241m=\u001b[39m requests\u001b[38;5;241m.\u001b[39mget(\u001b[43mr\u001b[49m\u001b[43m[\u001b[49m\u001b[38;5;241;43m0\u001b[39;49m\u001b[43m]\u001b[49m, headers\u001b[38;5;241m=\u001b[39mheaders)\n\u001b[1;32m 10\u001b[0m \u001b[38;5;28;01mtry\u001b[39;00m:\n\u001b[1;32m 11\u001b[0m \u001b[38;5;28mprint\u001b[39m(json\u001b[38;5;241m.\u001b[39mdumps(response\u001b[38;5;241m.\u001b[39mjson(), indent\u001b[38;5;241m=\u001b[39m\u001b[38;5;241m4\u001b[39m))\n", + "Cell \u001b[0;32mIn[25], line 8\u001b[0m\n\u001b[1;32m 2\u001b[0m \u001b[38;5;28;01mimport\u001b[39;00m \u001b[38;5;21;01mjson\u001b[39;00m\n\u001b[1;32m 4\u001b[0m headers \u001b[38;5;241m=\u001b[39m {\n\u001b[1;32m 5\u001b[0m \u001b[38;5;124m'\u001b[39m\u001b[38;5;124mContent-Type\u001b[39m\u001b[38;5;124m'\u001b[39m: \u001b[38;5;124m'\u001b[39m\u001b[38;5;124mapplication/json\u001b[39m\u001b[38;5;124m'\u001b[39m,\n\u001b[1;32m 6\u001b[0m }\n\u001b[0;32m----> 8\u001b[0m response \u001b[38;5;241m=\u001b[39m requests\u001b[38;5;241m.\u001b[39mget(\u001b[43mr\u001b[49m\u001b[43m[\u001b[49m\u001b[38;5;241;43m0\u001b[39;49m\u001b[43m]\u001b[49m, headers\u001b[38;5;241m=\u001b[39mheaders)\n\u001b[1;32m 10\u001b[0m \u001b[38;5;28;01mtry\u001b[39;00m:\n\u001b[1;32m 11\u001b[0m \u001b[38;5;28mprint\u001b[39m(json\u001b[38;5;241m.\u001b[39mdumps(response\u001b[38;5;241m.\u001b[39mjson(), indent\u001b[38;5;241m=\u001b[39m\u001b[38;5;241m4\u001b[39m))\n", "\u001b[0;31mIndexError\u001b[0m: list index out of range" ] } diff --git a/swan/__init__.py b/swan/__init__.py index c305ed7e..ac490cce 100644 --- a/swan/__init__.py +++ b/swan/__init__.py @@ -1,6 +1,5 @@ # ./swan/__init__.py from swan.api.swan_api import SwanAPI -from swan.api.mcs_api import MCSAPI from swan.api_client import APIClient from swan.contract.swan_contract import SwanContract diff --git a/swan/api/mcs_api.py b/swan/api/mcs_api.py deleted file mode 100644 index 64eaae4c..00000000 --- a/swan/api/mcs_api.py +++ /dev/null @@ -1,519 +0,0 @@ -import logging -from hashlib import md5 -from queue import Queue -import threading -import urllib.request -import logging -import glob -import tarfile -import os -import json -import requests -import traceback -from contextlib import closing -from swan.common.utils import object_to_filename - -from swan.common.constant import * -from swan.api_client import APIClient - -class MCSAPI(APIClient): - - def __init__(self, api_key, access_token=None, MCS_API=None, login=True): - self.token = None - self.api_key = api_key - self.access_token = access_token - self.MCS_API = MCS_API if MCS_API else MCS_POLYGON_MAIN_API - self.gateway = None - if login: - self.api_key_login() - 
self.gateway = self.get_gateway() - - def api_key_login(self): - params = {'apikey': self.api_key} - try: - result = self._request_with_params( - POST, "/api/v2/user/login_by_api_key", self.MCS_API, params, None, None, json_body=True) - self.token = result['data'] - logging.info("\033[32mLogin successful\033[0m") - return self.token - except Exception as e: - logging.error("\033[31m Please check your APIkey.\033[0m") - logging.error(str(e) + "\n" + traceback.format_exc()) - logging.error(result) - return None - - def list_buckets(self): - try: - result = self._request_without_params( - GET, BUCKET_LIST, self.MCS_API, self.token) - bucket_info_list = [] - data = result['data'] - if data: - for bucket in data: - bucket_info: Bucket = Bucket(bucket) - bucket_info_list.append(bucket_info) - return bucket_info_list - except: - logging.error("An error occurred while executing list_buckets()") - return None - - def create_bucket(self, bucket_name): - params = {'bucket_name': bucket_name} - try: - result = self._request_with_params( - POST, CREATE_BUCKET, self.MCS_API, params, self.token, None, json_body=True) - if result['status'] == 'success': - logging.info("\033[32mBucket created successfully\033[0m") - return True - else: - logging.error("\033[31m" + result['message'] + "\033[0m") - except: - logging.error("\033[31mThis bucket already exists\033[0m") - - return False - - def delete_bucket(self, bucket_name): - try: - bucket_id = self._get_bucket_id(bucket_name) - params = {'bucket_uid': bucket_id} - result = self._request_with_params( - GET, DELETE_BUCKET, self.MCS_API, params, self.token, None, json_body=True) - if result['status'] == 'success': - logging.info("\033[32mBucket delete successfully\033[0m") - return True - except: - logging.error("\033[31mCan't find this bucket\033[0m") - return False - - def get_bucket(self, bucket_name='', bucket_id=''): - bucketlist = self.list_buckets() - if bucketlist: - if bucket_id != '' and bucket_name != '': - for bucket in bucketlist: - if bucket.bucket_name == bucket_name and bucket.bucket_uid == bucket_id: - return bucket - if bucket_name != '' and bucket_id == '': - for bucket in bucketlist: - if bucket.bucket_name == bucket_name: - return bucket - if bucket_name == '' and bucket_id != '': - for bucket in bucketlist: - if bucket.bucket_uid == bucket_id: - return bucket - logging.error("\033[31mUser does not have this bucket\033[0m") - return None - - # object name - def get_file(self, bucket_name, object_name): - try: - bucket_id = self._get_bucket_id(bucket_name) - params = {"bucket_uid": bucket_id, "object_name": object_name} - - result = self._request_with_params( - GET, GET_FILE, self.MCS_API, params, self.token, None, json_body=True) - - if result: - return File(result['data'], self.gateway) - except Exception as e: - logging.error("\033[31mCannot get file\033[0m") - logging.error(f'{str(e)} \n {traceback.format_exc()}') - logging.error(result) - return - - def create_folder(self, bucket_name, folder_name, prefix=''): - if not folder_name: - logging.error("\033[31mFolder name cannot be empty") - return False - try: - bucket_id = self._get_bucket_id(bucket_name) - params = {"file_name": folder_name, - "prefix": prefix, "bucket_uid": bucket_id} - result = self._request_with_params( - POST, CREATE_FOLDER, self.MCS_API, params, self.token, None, json_body=True) - if result['status'] == 'success': - logging.info("\033[31mFolder created successfully\033[0m") - return True - else: - logging.error("\033[31m" + result['message'] + "\033[0m") - return 
False - except: - logging.error("\033[31mCan't create this folder") - return False - - def delete_file(self, bucket_name, object_name): - try: - prefix, file_name = object_to_filename(object_name) - file_list = self._get_full_file_list(bucket_name, prefix) - if file_list is None: - return False - - file_id = '' - for file in file_list: - if file.name == file_name: - file_id = file.id - params = {'file_id': file_id} - if file_id == '': - logging.error("\033[31mCan't find the file\033[0m") - return False - result = self._request_with_params( - GET, DELETE_FILE, self.MCS_API, params, self.token, None, json_body=True) - if result['status'] == 'success': - logging.info("\033[32mFile delete successfully\033[0m") - return True - else: - logging.error("\033[31mCan't delete the file\033[0m") - return False - except: - logging.error("\033[31mCan't find this bucket\033[0m") - return False - - def list_files(self, bucket_name, prefix='', limit=10, offset=0): - - if type(limit) is not int or type(offset) is not int: - logging.error("\033[31mInvalid parameters\033[0m") - return None - - try: - bucket_id = self._get_bucket_id(bucket_name) - if bucket_id == '': - logging.error("\033[31mCan't find this bucket\033[0m") - return None - except Exception as e: - logging.error("\033[31mCan't find this bucket\033[0m") - logging.error(f"{str(e)} \n {traceback.format_exc()}") - return None - - try: - params = {'bucket_uid': bucket_id, 'prefix': prefix, - 'limit': limit, 'offset': offset} - result = self._request_with_params( - GET, FILE_LIST, self.MCS_API, params, self.token, None , json_body=True) - if result['status'] == 'success': - files = result['data']['file_list'] - file_list = [] - for file in files: - file_info: File = File(file, self.gateway) - file_list.append(file_info) - return file_list - except Exception as e: - logging.error("\033[31mCan't list files\033[0m") - logging.error(f"{str(e)} \n {traceback.format_exc()}") - logging.error(result) - return None - - def upload_file(self, bucket_name, object_name, file_path, replace=False): - try: - prefix, file_name = object_to_filename(object_name) - bucket_id = self._get_bucket_id(bucket_name) - if bucket_id is None: - logging.error("\033[31mCan't find this bucket\033[0m") - return None - if not file_name: - logging.error("\033[31mFile name cannot be empty") - return None - - # if os.stat(file_path).st_size == 0: - # logging.error("\033[31mFile size cannot be 0\033[0m") - # return None - - file_size = os.stat(file_path).st_size - with open(file_path, 'rb') as file: - file_hash = md5(file.read()).hexdigest() - result = self._check_file(bucket_id, file_hash, file_name, prefix) - - # Replace file if already existed - if result['data']['file_is_exist'] and replace: - self.delete_file(bucket_name, object_name) - result = self._check_file( - bucket_id, file_hash, file_name, prefix) - if not (result['data']['file_is_exist']): - if not (result['data']['ipfs_is_exist']): - with open(file_path, 'rb') as file: - i = 0 - queue = Queue() - self.upload_progress_bar( - file_name, file_size) - for chunk in self._read_chunks(file): - i += 1 - queue.put((str(i), chunk)) - file.close() - threads = list() - for i in range(3): - worker = threading.Thread( - target=self._thread_upload_chunk, args=(queue, file_hash, file_name)) - threads.append(worker) - worker.start() - for thread in threads: - thread.join() - result = self._merge_file( - bucket_id, file_hash, file_name, prefix) - if result is None: - logging.error("\033[31m Merge file failed\033[0m") - return None - file_id = 
result['data']['file_id'] - file_info = self._get_file_info(file_id) - if file_info is None: - logging.error("\033[31m Get file info failed\033[0m") - return None - self._create_folders(bucket_name, prefix) - logging.info("\033[32mFile upload successfully\033[0m") - return file_info - logging.error("\033[31mFile already exists\033[0m") - return None - except Exception as e: - logging.error("\033[31mError while uploading file\033[0m") - logging.error(f"{str(e)} \n {traceback.format_exc()}") - return None - - def _create_folders(self, bucket_name, path): - bucket_id = self._get_bucket_id(bucket_name) - if bucket_id: - path, folder_name = object_to_filename(path) - while folder_name: - params = {"file_name": folder_name, - "prefix": path, "bucket_uid": bucket_id} - self._request_with_params( - POST, CREATE_FOLDER, self.MCS_API, params, self.token, None, json_body=True) - path, folder_name = object_to_filename(path) - return True - else: - logging.error("\033[31mBucket not found\033[0m") - return False - - def _upload_to_bucket(self, bucket_name, object_name, file_path): - if os.path.isdir(file_path): - return self.upload_folder(bucket_name, object_name, file_path) - else: - return self.upload_file(bucket_name, object_name, file_path) - - def upload_folder(self, bucket_name, object_name, folder_path): - prefix, folder_name = object_to_filename(object_name) - folder_res = self.create_folder(bucket_name, folder_name, prefix) - if folder_res is True: - res = [] - files = os.listdir(folder_path) - for f in files: - f_path = os.path.join(folder_path, f) - upload = self._upload_to_bucket( - bucket_name, os.path.join(object_name, f), f_path) - res.append(upload) - - self._create_folders(bucket_name, prefix) - return res - return None - - def upload_ipfs_folder(self, bucket_name, object_name, folder_path): - # folder_name = os.path.basename(object_name) or os.path.basename(folder_path) - # prefix = os.path.normpath(os.path.dirname(object_name)) if os.path.dirname(object_name) else '' - prefix, folder_name = object_to_filename(object_name) - - if not folder_name: - logging.error("\033[31mFolder name cannot be empty") - return False - - bucket_uid = self._get_bucket_id(bucket_name) - if bucket_uid: - files = self._read_files(folder_path, folder_name) - form_data = {"folder_name": folder_name, - "prefix": prefix, "bucket_uid": bucket_uid} - res = self._request_with_params( - POST, PIN_IPFS, self.MCS_API, form_data, self.token, files, json_body=True) - if res and res["data"]: - self._create_folders(bucket_name, prefix) - folder = (File(res["data"], self.gateway)) - return folder - else: - logging.error("\033[31mIPFS Folder Upload Error\033[0m") - return None - else: - logging.error("\033[31mBucket not found\033[0m") - return None - - def download_file(self, bucket_name, object_name, local_filename): - try: - file = self.get_file(bucket_name, object_name) - except: - logging.error('\033[31mFile does not exist\033[0m') - return False - - try: - ipfs_url = file.ipfs_url - with open(local_filename, 'wb') as f: - if file.size > 0: - data = urllib.request.urlopen(ipfs_url) - if data: - f.write(data.read()) - logging.info( - "\033[32mFile downloaded successfully\033[0m") - return True - else: - logging.error('\033[31mDownload failed\033[0m') - return False - except: - logging.error('\033[31mDownload failed\033[0m') - return False - - def download_ipfs_folder(self, bucket_name, object_name, folder_path): - folder = self.get_file(bucket_name, object_name) - if folder is None: - logging.error('\033[31mFolder does 
not exist\033[0m') - return False - dir_name, folder_name = os.path.split(folder_path) - download_url = folder.gateway + "/api/v0/get?arg=" + \ - folder.payloadCid + "&create=true" - - if os.path.exists(folder_path): - logging.error('\033[31mFolder already exists\033[0m') - return False - try: - with closing(requests.post(download_url, stream=True)) as resp: - if resp.status_code != 200: - logging.error('\033[31mFile download failed\033[0m') - return False - with tarfile.open(fileobj=resp.raw, mode="r|*") as tar: - tar.extractall(path=dir_name) - first_name = next(iter(tar), None).name - if dir_name != "": - first_name = dir_name + "/" + first_name - os.rename(first_name, folder_path) - return True - except Exception: - logging.error('\033[31mFile download failed\033[0m') - return False - - def _check_file(self, bucket_id, file_hash, file_name, prefix=''): - params = {'bucket_uid': bucket_id, 'file_hash': file_hash, - 'file_name': file_name, 'prefix': prefix} - return self._request_with_params(POST, CHECK_UPLOAD, self.MCS_API, params, self.token, None, json_body=True) - - def _upload_chunk(self, file_hash, file_name, chunk): - params = {'hash': file_hash, 'file': (file_name, chunk)} - return self._request_bucket_upload(UPLOAD_CHUNK, self.MCS_API, params, self.token) - - def _thread_upload_chunk(self, queue, file_hash, file_name): - while not queue.empty(): - chunk = queue.get() - self._upload_chunk(file_hash, chunk[0] + '_' + file_name, chunk[1]) - - def _merge_file(self, bucket_id, file_hash, file_name, prefix=''): - params = {'bucket_uid': bucket_id, 'file_hash': file_hash, - 'file_name': file_name, 'prefix': prefix} - return self._request_with_params(POST, MERGE_FILE, self.MCS_API, params, self.token, None, json_body=True) - - def _read_chunks(self, file, chunk_size=10485760): - while True: - data = file.read(chunk_size) - if not data: - break - yield data - - def _get_bucket_id(self, bucket_name): - bucketlist = self.list_buckets() - if bucketlist: - for bucket in bucketlist: - if bucket.bucket_name == str(bucket_name): - return bucket.bucket_uid - return None - - def _get_full_file_list(self, bucket_name, prefix=''): - bucket_id = self._get_bucket_id(bucket_name) - if bucket_id is None: - return None - params = {'bucket_uid': bucket_id, - 'prefix': prefix, 'limit': 10, 'offset': 0} - count = self._request_with_params(GET, FILE_LIST, self.MCS_API, params, self.token, None, json_body=True)['data'][ - 'count'] - file_list = [] - for i in range(count // 10 + 1): - params['offset'] = i * 10 - result = \ - self._request_with_params(GET, FILE_LIST, self.MCS_API, params, self.token, None, json_body=True)['data'][ - 'file_list'] - for file in result: - file_info: File = File(file, self.gateway) - file_list.append(file_info) - return file_list - - def _get_file_info(self, file_id): - params = {'file_id': file_id} - result = self._request_with_params( - GET, FILE_INFO, self.MCS_API, params, self.token, None, json_body=True) - file_info = File(result['data'], self.gateway) - return file_info - - def get_gateway(self): - try: - result = self._request_without_params( - GET, GET_GATEWAY, self.MCS_API, self.token) - if result is None: - return - data = result['data'] - return 'https://' + data[0] - except: - logging.error("\033[31m" "Get Gateway failed" "\033[0m") - return - - def _read_files(self, root_folder, folder_name): - # Create an empty list to store the file tuples - file_dict = [] - - # Use glob to retrieve the file paths in the directory and its subdirectories - file_paths = 
glob.glob(os.path.join( - root_folder, '**', '*'), recursive=True) - - # Loop through each file path and read the contents of the file - for file_path in file_paths: - if os.path.isfile(file_path): - # Get the relative path from the root folder - upload_path = folder_name + "/" + \ - os.path.relpath(file_path, root_folder) - file_dict.append(('files', ( - upload_path, open(file_path, 'rb')))) - - return file_dict - - -class Bucket: - def __init__(self, bucket_data): - self.deleted_at = bucket_data["deleted_at"] - self.updated_at = bucket_data["updated_at"] - self.created_at = bucket_data["created_at"] - self.file_number = bucket_data["file_number"] - self.bucket_name = bucket_data["bucket_name"] - self.is_deleted = bucket_data["is_deleted"] - self.is_active = bucket_data["is_active"] - self.is_free = bucket_data["is_free"] - self.size = bucket_data["size"] - self.max_size = bucket_data["max_size"] - self.address = bucket_data["address"] - self.bucket_uid = bucket_data["bucket_uid"] - - def to_json(self): - return json.dumps(self, default=lambda o: o.__dict__, - sort_keys=True, indent=4) - - -class File: - def __init__(self, file_data, gateway = 'https://ipfs.io'): - self.name = file_data["name"] - self.address = file_data["address"] - self.bucket_uid = file_data["bucket_uid"] - self.filehash = file_data["file_hash"] - self.prefix = file_data["prefix"] - self.size = file_data["size"] - self.payloadCid = file_data["payload_cid"] - self.ipfs_url = gateway + '/ipfs/' + file_data["payload_cid"] - self.pin_status = file_data["pin_status"] - self.is_deleted = file_data["is_deleted"] - self.is_folder = file_data["is_folder"] - self.id = file_data["id"] - self.updated_at = file_data["updated_at"] - self.created_at = file_data["created_at"] - self.deleted_at = file_data["deleted_at"] - self.gateway = gateway - self.object_name = file_data["object_name"] - self.type = file_data["type"] - - def to_json(self): - return json.dumps(self, default=lambda o: o.__dict__, - sort_keys=True, indent=4) \ No newline at end of file diff --git a/swan/api/swan_api.py b/swan/api/swan_api.py index bb12eae0..32b38c16 100644 --- a/swan/api/swan_api.py +++ b/swan/api/swan_api.py @@ -7,7 +7,7 @@ from swan.api_client import APIClient from swan.common.constant import * -from swan.object import HardwareConfig, LagrangeSpace +from swan.object import HardwareConfig from swan.common.exception import SwanAPIException class SwanAPI(APIClient): diff --git a/swan/api_client.py b/swan/api_client.py index 4da9a217..34370e7f 100644 --- a/swan/api_client.py +++ b/swan/api_client.py @@ -7,7 +7,7 @@ from pathlib import Path from requests_toolbelt.multipart.encoder import MultipartEncoder, MultipartEncoderMonitor -from swan.common.constant import GET, PUT, POST, DELETE, SWAN_API, APIKEY_LOGIN +from swan.common.constant import GET, PUT, POST, DELETE, SWAN_API from swan.common import utils @@ -46,8 +46,8 @@ def _request(self, method, request_path, swan_api, params, token, files=False, j return response.json() - def _request_stream_upload(self, request_path, mcs_api, params, token): - url = mcs_api + request_path + def _request_stream_upload(self, request_path, swwan_api, params, token): + url = swwan_api + request_path header = {} if token: header["Authorization"] = "Bearer " + token @@ -78,8 +78,8 @@ def _request_stream_upload(self, request_path, mcs_api, params, token): return response.json() - def _request_bucket_upload(self, request_path, mcs_api, params, token): - url = mcs_api + request_path + def _request_bucket_upload(self, 
request_path, swwan_api, params, token): + url = swwan_api + request_path header = {} if token: header["Authorization"] = "Bearer " + token diff --git a/swan/common/__init__.py b/swan/common/__init__.py index f0cadc54..99e601c5 100644 --- a/swan/common/__init__.py +++ b/swan/common/__init__.py @@ -1,14 +1 @@ -# ./swan/common/__init__.py - -# from swan_mcs import APIClient - -# import os - - -# def init_mcs_api(): -# return APIClient( -# os.getenv("MCS_API_KEY"), os.getenv("MCS_ACCESS_TOKEN"), "polygon.mainnet" -# ) - - -# mcs_api = init_mcs_api() +# ./swan/common/__init__.py \ No newline at end of file diff --git a/swan/common/constant.py b/swan/common/constant.py index 6196b210..d4cd5184 100644 --- a/swan/common/constant.py +++ b/swan/common/constant.py @@ -31,36 +31,4 @@ STORAGE_LAGRANGE: str = "lagrange" ORCHESTRATOR_API = "orchestrator-api.swanchain.io" MAX_DURATION = 1209600 -ORCHESTRATOR_PUBLIC_ADDRESS = "0x29eD49c8E973696D07E7927f748F6E5Eacd5516D" - -# MCS API -MCS_POLYGON_MAIN_API = "https://api.swanipfs.com" -MCS_POLYGON_MUMBAI_API = "https://calibration-mcs-api.filswan.com" -MCS_BSC_API = 'https://calibration-mcs-bsc.filswan.com' -FIL_PRICE_API = "https://api.filswan.com/stats/storage" -MCS_PARAMS = "/api/v1/common/system/params" -PRICE_RATE = "/api/v1/billing/price/filecoin" -PAYMENT_INFO = "/api/v1/billing/deal/lockpayment/info" -TASKS_DEALS = "/api/v1/storage/tasks/deals" -MINT_INFO = "/api/v1/storage/mint/info" -UPLOAD_FILE = "/api/v1/storage/ipfs/upload" -DEAL_DETAIL = "/api/v1/storage/deal/detail/" -USER_REGISTER = "/api/v1/user/register" -USER_LOGIN = "/api/v1/user/login_by_metamask_signature" -GENERATE_APIKEY = "/api/v1/user/generate_api_key" -APIKEY_LOGIN = "/api/v2/user/login_by_api_key" -COLLECTIONS = "/api/v1/storage/mint/nft_collections" -COLLECTION = "/api/v1/storage/mint/nft_collection" -CREATE_BUCKET = "/api/v2/bucket/create" -BUCKET_LIST = "/api/v2/bucket/get_bucket_list" -DELETE_BUCKET = "/api/v2/bucket/delete" -FILE_INFO = "/api/v2/oss_file/get_file_info" -DELETE_FILE = "/api/v2/oss_file/delete" -CREATE_FOLDER = "/api/v2/oss_file/create_folder" -CHECK_UPLOAD = "/api/v2/oss_file/check" -UPLOAD_CHUNK = "/api/v2/oss_file/upload" -MERGE_FILE = "/api/v2/oss_file/merge" -FILE_LIST = "/api/v2/oss_file/get_file_list" -GET_FILE = "/api/v2/oss_file/get_file_by_object_name" -GET_GATEWAY = "/api/v2/gateway/get_gateway" -PIN_IPFS = "/api/v2/oss_file/pin_files_to_ipfs" \ No newline at end of file +ORCHESTRATOR_PUBLIC_ADDRESS = "0x29eD49c8E973696D07E7927f748F6E5Eacd5516D" \ No newline at end of file diff --git a/swan/common/exception.py b/swan/common/exception.py index 5e0b388e..de1b41d8 100644 --- a/swan/common/exception.py +++ b/swan/common/exception.py @@ -15,46 +15,4 @@ class SwanRequestException(Exception): class SwanParamsException(Exception): - pass - -class McsAPIException(Exception): - - def __init__(self, response): - print(response.text + ', ' + str(response.status_code)) - self.code = 0 - try: - json_res = response.json() - except ValueError: - self.message = 'Invalid JSON error message from swan_mcs: {}'.format(response.text) - else: - if "error_code" in json_res.keys() and "error_message" in json_res.keys(): - self.code = json_res['error_code'] - self.message = json_res['error_message'] - else: - self.code = 'None' - self.message = 'System error' - - self.status_code = response.status_code - self.response = response - self.request = getattr(response, 'request', None) - - def __str__(self): # pragma: no cover - return 'API Request Error(error_code=%s): %s' % 
(self.code, self.message) - - -class McsRequestException(Exception): - - def __init__(self, message): - self.message = message - - def __str__(self): - return 'McsRequestException: %s' % self.message - - -class McsParamsException(Exception): - - def __init__(self, message): - self.message = message - - def __str__(self): - return 'McsParamsException: %s' % self.message \ No newline at end of file + pass \ No newline at end of file diff --git a/swan/common/utils.py b/swan/common/utils.py index c08ce40c..06b940b9 100644 --- a/swan/common/utils.py +++ b/swan/common/utils.py @@ -6,8 +6,6 @@ import datetime import time from urllib.parse import urlparse -# from swan_mcs import BucketAPI -# from swan.common import mcs_api from swan.common.file import File def parse_params_to_str(params): diff --git a/swan/object/__init__.py b/swan/object/__init__.py index b5890086..b9befca9 100644 --- a/swan/object/__init__.py +++ b/swan/object/__init__.py @@ -1,5 +1,4 @@ # ./swan/object/__init__.py from swan.object.cp_config import HardwareConfig -from swan.object.task import Task -from swan.object.source_uri import SourceFilesInfo, Repository, LagrangeSpace \ No newline at end of file +from swan.object.task import Task \ No newline at end of file diff --git a/swan/object/source_uri.py b/swan/object/source_uri.py deleted file mode 100644 index b6e46669..00000000 --- a/swan/object/source_uri.py +++ /dev/null @@ -1,369 +0,0 @@ -import json -import requests -import uuid -from swan.api.mcs_api import MCSAPI, File -from swan.common.utils import datetime_to_unixtime -from swan.object.cp_config import HardwareConfig - - -class SourceFilesInfo(): - - def __init__(self): - self.file_list = [] - self.mcs_client = None - - def mcs_connection(self, mcs_client: MCSAPI): - """Add mcs connection. - - Args: - mcs_client: mcs API object. - """ - self.mcs_client = mcs_client - - def add_folder(self, bucket_name: str, object_name: str): - """Add a folder/repo to the source file info. - - Args: - bucket_name: bucket name of bucket locating the folder. - object_name: folder object_name (directory) - """ - folder_file = self.mcs_client._get_full_file_list(bucket_name, object_name) - self.file_list += self.get_folder_files(bucket_name, folder_file) - - - def add_file(self, bucket_name: str, object_name: str): - """Add single file to source file info. - - Args: - bucket_name: bucket name of bucket locating the folder. - object_name: file object_name (directory + file name) - """ - file = self.mcs_client.get_file(bucket_name, object_name) - list.append(file) - - def _file_to_dict(self, file: File): - return { - "cid": file.payloadCid, - "created_at": datetime_to_unixtime(file.created_at), - "name": f'0x000000/spaces/{str(file.name)}', - "updated_at": datetime_to_unixtime(file.updated_at), - "url": file.ipfs_url - } - - def get_folder_files(self, bucket_name: str, file_list: list): - """Retrieve all files under folders. - - Args: - bucket_name: bucket name of bucket locating the folder. - file_llist: list of directories (cotain folders). - - Returns: - Updated file_list without folder and all files under given directories. 
- """ - folder_list = [] - exist_folder = False - for file in file_list: - if file.is_folder: - exist_folder = True - folder_list.append(file) - file_list.remove(file) - for folder in folder_list: - new_file_list = self.mcs_client.list_files(bucket_name, folder.object_name) - file_list += new_file_list - if exist_folder: - file_list = self.get_folder_files(bucket_name, file_list) - return file_list - - def to_dict(self): - return { - "data": { - "files": [ - self._file_to_dict(file) for file in self.file_list - ] - } - } - - def to_json(self): - return json.dumps(self, default=lambda o: o.__dict__, - sort_keys=True, indent=4) - - -class Repository(): - - def __init__(self): - """Initialize repository for swan task. - """ - self.folder_dir = None - self.bucket = None - self.path = None - self.source_uri = None - - def update_bucket_info(self, bucket_name: str, object_name: str): - """Update the current mcs bucket info on MCS. - - Args: - bucket_name: name of bucket. - object_name: directory + file_name. - """ - self.bucket = bucket_name - self.path = object_name - - def add_local_dir(self, folder_dir: str): - """Initialize repository for swan task. - - Args: - folder_dir: Directory of folder to upload to mcs. - """ - self.folder_dir = folder_dir - - def mcs_connection(self, mcs_client: MCSAPI): - """Add mcs connection. - - Args: - mcs_client: mcs API object. - """ - self.mcs_client = mcs_client - - def upload_local_to_mcs(self, bucket_name: str, obj_name: str, mcs_client: MCSAPI = None): - """Upload repository to MCS to create remote source. - - Args: - bucket_name: bucket to upload repository. - obj_name: dir + file_name for repository. - mcs_client: mcs API object. - - Returns: - API response from uploading folder to MCS. Contain info of list of files uploaded. - """ - if self.folder_dir: - if mcs_client: - self.mcs_connection(mcs_client) - self.bucket = bucket_name - self.path = obj_name - upload = mcs_client.upload_folder(bucket_name, obj_name, self.folder_dir) - return upload - return None - - def generate_source_uri(self, bucket_name: str, obj_name: str, file_path: str, mcs_client: MCSAPI = None, replace: bool = True): - """Generate source uri for task using MCS service. - - Args: - bucket_name: bucket name to store source uri. - obj_name: object name (dir + file name) to store source uri. - file_path: local file path to store source uri JSON file. - replace: replace existing file or not. - mcs_client: mcs API object. - - Returns: - API response from MCS after uploading. Contains file information. 
- - """ - if self.bucket and self.path: - if mcs_client: - self.mcs_connection(mcs_client) - local_source = SourceFilesInfo() - local_source.mcs_connection(mcs_client) - local_source.add_folder(self.bucket, self.path) - with open(file_path, "w") as file: - json.dump(local_source.to_dict(), file) - res = mcs_client.upload_file(bucket_name, obj_name, file_path, replace) - self.source_uri = res.ipfs_url - return res - - def to_dict(self): - return { - "folder_dir": self.folder_dir, - "bucket_name": self.bucket, - "folder_path": self.path, - "source_uri": self.source_uri - } - - def to_json(self): - return json.dumps(self, default=lambda o: o.__dict__, - sort_keys=True, indent=4) - - -class LagrangeSpace(): - - def __init__(self, space_owner: str, space_name: str, wallet_address: str, hardware_config: HardwareConfig): - - self.wallet_address = wallet_address - self.space_name = space_name - self.space_owner = space_owner - self.hardware_config = hardware_config - self.space_uuid = None - self.source_uri = None - self.file_list = [] - - def get_space_info(self): - space_file_uri = f'https://api.lagrangedao.org/spaces/{self.space_owner}/{self.space_name}/files' - space_detail_uri = f'https://api.lagrangedao.org/spaces/{self.space_owner}/{self.space_name}' - space_files = requests.get(space_file_uri).json() - space_detail = requests.get(space_detail_uri).json() - - for file in space_files["data"]: - self.file_list.append(file) - - self.space_uuid = space_detail["data"]["space"]["uuid"] - - - def mcs_connection(self, mcs_client: MCSAPI): - """Add mcs connection. - - Args: - mcs_client: mcs API object. - """ - self.mcs_client = mcs_client - - def to_dict(self): - hardware_info = self.hardware_config.description - hardware_info = hardware_info.split(" · ") - - return { - "data": { - "files": self.file_list, - "owner": { - "public_address": self.wallet_address, - }, - "space": { - "activeOrder": { - "config": { - "description": self.hardware_config.description, - "hardware": hardware_info[0], - "hardware_id": int(self.hardware_config.id), - "hardware_type": self.hardware_config.type, - "memory": [int(s) for s in hardware_info[2].split() if s.isdigit()][0], - "name": self.hardware_config.name, - "price_per_hour": float(self.hardware_config.price), - "vcpu": [int(s) for s in hardware_info[1].split() if s.isdigit()][0], - }, - }, - "name": self.space_name, - "uuid": self.space_uuid, - } - } - } - - def to_json(self): - return json.dumps(self, default=lambda o: o.__dict__, - sort_keys=True, indent=4) - - def generate_source_uri(self, bucket_name: str, obj_name: str, file_path: str, mcs_client: MCSAPI = None, replace: bool = True): - """Generate source uri for task using MCS service. - - Args: - bucket_name: bucket name to store source uri. - obj_name: object name (dir + file name) to store source uri. - file_path: local file path to store source uri JSON file. - replace: replace existing file or not. - mcs_client: mcs API object. - - Returns: - API response from MCS after uploading. Contains file information. 
- - """ - if mcs_client: - self.mcs_connection(mcs_client) - with open(file_path, "w") as file: - json.dump(self.to_dict(), file) - res = mcs_client.upload_file(bucket_name, obj_name, file_path, replace) - self.source_uri = res.ipfs_url - return res - -class GithubRepo(): - - def __init__(self, repo_owner: str, repo_name: str, repo_branch: str, wallet_address: str, hardware_config: HardwareConfig, repo_uri: str = None): - - if repo_uri and not (repo_owner or repo_name): - # TO DO: Add retrieve repo owner, name and branch from URI - pass - - self.repo_owner = repo_owner - self.repo_name = repo_name - self.repo_branch = repo_branch - self.wallet_address = wallet_address - self.hardware_config = hardware_config - self.repo_uuid = str(uuid.uuid4()) - self.mcs_client = None - self.source_uri = None - self.file_list = [] - - def mcs_connection(self, mcs_client: MCSAPI): - """Add mcs connection. - - Args: - mcs_client: mcs API object. - """ - self.mcs_client = mcs_client - - def get_github_tree(self): - github_tree_uri = f"https://api.github.com/repos/{self.repo_owner}/{self.repo_name}/git/trees/{self.repo_branch}?recursive=1" - - github_repo_files = requests.get(github_tree_uri).json() - - for file in github_repo_files["tree"]: - if file["type"] == "blob": - self.file_list.append( - { - "cid": file["sha"], - "created_at": None, - "name": f"{self.repo_owner}/{self.repo_name}/{self.repo_branch}/{file['path']}", - "updated_at": None, - "url": f"https://raw.githubusercontent.com/{self.repo_owner}/{self.repo_name}/{self.repo_branch}/{file['path']}" - } - ) - - def to_dict(self): - hardware_info = self.hardware_config.description - hardware_info = hardware_info.split(" · ") - - return { - "data": { - "files": self.file_list, - "owner": { - "public_address": self.wallet_address, - }, - "space": { - "activeOrder": { - "config": { - "description": self.hardware_config.description, - "hardware": hardware_info[0], - "hardware_id": int(self.hardware_config.id), - "hardware_type": self.hardware_config.type, - "memory": [int(s) for s in hardware_info[2].split() if s.isdigit()][0], - "name": self.hardware_config.name, - "price_per_hour": float(self.hardware_config.price), - "vcpu": [int(s) for s in hardware_info[1].split() if s.isdigit()][0], - }, - }, - "name": self.repo_name, - "uuid": self.repo_uuid, - } - } - } - - def to_json(self): - return json.dumps(self, default=lambda o: o.__dict__, - sort_keys=True, indent=4) - - def generate_source_uri(self, bucket_name: str, obj_name: str, file_path: str, mcs_client: MCSAPI = None, replace: bool = True): - """Generate source uri for task using MCS service. - - Args: - bucket_name: bucket name to store source uri. - obj_name: object name (dir + file name) to store source uri. - file_path: local file path to store source uri JSON file. - replace: replace existing file or not. - mcs_client: mcs API object. - - Returns: - API response from MCS after uploading. Contains file information. 
- - """ - if mcs_client: - self.mcs_connection(mcs_client) - with open(file_path, "w") as file: - json.dump(self.to_dict(), file) - res = mcs_client.upload_file(bucket_name, obj_name, file_path, replace) - self.source_uri = res.ipfs_url - return res \ No newline at end of file diff --git a/test/dev_test.ipynb b/test/dev_test.ipynb deleted file mode 100644 index 13394a40..00000000 --- a/test/dev_test.ipynb +++ /dev/null @@ -1,557 +0,0 @@ -{ - "cells": [ - { - "cell_type": "code", - "execution_count": 1, - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "gmP5D6yRr8\n", - "eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJmcmVzaCI6ZmFsc2UsImlhdCI6MTcxMzU2MDY2NiwianRpIjoiOThiOWIzYzktNjFlYy00OGE0LWE0OTYtYmVjNWI4YTU2NjBkIiwidHlwZSI6ImFjY2VzcyIsInN1YiI6IjB4NjFjM2UwM2RiZWQ1NWY1REUyMTM3MzJlODE2RjhBOEZkNkU5YmZGMCIsIm5iZiI6MTcxMzU2MDY2NiwiY3NyZiI6IjdkMWQ2NmViLTdmNTMtNGVmYy05ODRiLWY1NzY0OGNmNWU1NiIsImV4cCI6MTcxNjE1MjY2Nn0.uFFizTbsq6jSHK_sgl38_ZL1eqj8HLBrNqDc46hckzg\n" - ] - } - ], - "source": [ - "import sys\n", - "sys.path.insert(0, '..')\n", - "\n", - "import os\n", - "from dotenv import load_dotenv\n", - "load_dotenv()\n", - "\n", - "api_key = os.getenv('API_KEY')\n", - "print(api_key)\n", - "\n", - "# Test login \n", - "from swan import SwanAPI\n", - "\n", - "client = SwanAPI(api_key)\n", - "print(client.token)\n", - "\n", - "# r = [hardware for hardware in client.get_hardware_config()]\n", - "# print(r[1].region)\n", - "\n", - "# r = client._verify_hardware_region('Hpc1ae.2xlarge', 'North Carolina-US')\n", - "# print(r)\n", - "\n", - "# # Deploy Space\n", - "n = os.getenv('NAME')\n", - "r = os.getenv('REGION')\n", - "s = os.getenv('START')\n", - "d = os.getenv('DURATION')\n", - "u = os.getenv('SOURCE')\n", - "w = os.getenv('WALLET')\n", - "t = os.getenv('TX')\n", - "r = client.deploy_task(n, r, s, d, u, w, t, 11)\n", - "# print(r)" - ] - }, - { - "cell_type": "code", - "execution_count": 9, - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "{'data': {'computing_providers': [], 'jobs': [], 'task': {'created_at': '1713560675', 'end_at': '1713597866', 'leading_job_id': None, 'refund_amount': None, 'status': 'failed', 'task_detail_cid': 'https://data.mcs.lagrangedao.org/ipfs/QmS5m628bpDsozhYgYVVk41FmrhdPDY138nk1EyN1CPGPA', 'tx_hash': None, 'updated_at': '1713560683', 'uuid': '197d8e39-518c-49f1-a757-9b0d14175566'}}, 'message': \"fetch task info for task_uuid='197d8e39-518c-49f1-a757-9b0d14175566' successfully\", 'status': 'success'}\n" - ] - } - ], - "source": [ - "print(r)" - ] - }, - { - "cell_type": "code", - "execution_count": 8, - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "{'data': {'computing_providers': [], 'jobs': [], 'task': {'created_at': '1713560675', 'end_at': '1713597866', 'leading_job_id': None, 'refund_amount': None, 'status': 'failed', 'task_detail_cid': 'https://data.mcs.lagrangedao.org/ipfs/QmS5m628bpDsozhYgYVVk41FmrhdPDY138nk1EyN1CPGPA', 'tx_hash': None, 'updated_at': '1713560683', 'uuid': '197d8e39-518c-49f1-a757-9b0d14175566'}}, 'message': \"fetch task info for task_uuid='197d8e39-518c-49f1-a757-9b0d14175566' successfully\", 'status': 'success'}\n" - ] - } - ], - "source": [ - "r = client.get_deployment_info(\"\")\n", - "print(r)" - ] - }, - { - "cell_type": "code", - "execution_count": 7, - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "[]\n" - ] - } - ], - "source": [ - "r = 
client.get_real_url(\"\")\n", - "print(r)" - ] - }, - { - "cell_type": "code", - "execution_count": 19, - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "Hpc1ae.2xlarge\n", - "['California-US', 'North Carolina-US']\n" - ] - } - ], - "source": [ - "# r = [hardware if hardware.status == 'available' else None for hardware in client.get_hardware_config()]\n", - "# for i in range(len(r)):\n", - "# print(f'{i} {r[i].name if r[i] else None}')\n", - "# print(r[24].name)\n", - "# print(r[24].region)\n", - "\n" - ] - }, - { - "cell_type": "code", - "execution_count": 5, - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "('M1ae.3xlarge', ['Guangdong-CN'])\n" - ] - } - ], - "source": [ - "hardwares = client.get_hardware_config()\n", - "hl = [(hardware.name, hardware.region) for hardware in hardwares]\n", - "\n", - "for h in hl:\n", - " if h[0] == \"M1ae.3xlarge\":\n", - " print(h)" - ] - }, - { - "cell_type": "code", - "execution_count": 4, - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "None\n" - ] - } - ], - "source": [ - "print(client.token)" - ] - }, - { - "cell_type": "code", - "execution_count": 1, - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "gmP5D6yRr8\n", - "{'data': 'eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJmcmVzaCI6ZmFsc2UsImlhdCI6MTcxMDI3NTQ0NCwianRpIjoiNjYxYmI0M2QtMTgzYi00OTVhLTk5MzYtYWMxMjY0ODUzOTdmIiwidHlwZSI6ImFjY2VzcyIsInN1YiI6IjB4NjFjM2UwM2RiZWQ1NWY1REUyMTM3MzJlODE2RjhBOEZkNkU5YmZGMCIsIm5iZiI6MTcxMDI3NTQ0NCwiY3NyZiI6ImVkYjM0YzliLTM1Y2EtNDZlMi04NDViLTg0MjlmNWExYWJhYSIsImV4cCI6MTcxMjg2NzQ0NH0.6Bii2HVeyav1DvL4K6t_2yFgyZIiZBiXFvTyfD1EDQY', 'message': 'Token created', 'status': 'success'}\n", - "{'data': {'computing_providers': [], 'jobs': [], 'task': None}, 'message': \"No task found with given task_uuid='84220a02-e963-489c-90e1-794754042253'\", 'status': 'failed'}\n" - ] - } - ], - "source": [ - "import sys\n", - "sys.path.insert(0, '..')\n", - "\n", - "import os\n", - "from dotenv import load_dotenv\n", - "load_dotenv()\n", - "\n", - "api_key = os.getenv('API_KEY')\n", - "print(api_key)\n", - "\n", - "from swan import SwanAPI\n", - "client = SwanAPI(api_key)\n", - "\n", - "# Get deployment info\n", - "print(client.get_deployment_info('84220a02-e963-489c-90e1-794754042253'))" - ] - }, - { - "cell_type": "code", - "execution_count": 1, - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "1100000000000000000000\n", - "0x12981484b991e0547804b0d5c571e3584f4202a0a2eb45d9ad526b95aa53eece\n", - "0xb08eb18de9d6b95c61b02afa02f033be46bd54843b91c46a3360c118ee838f4d\n" - ] - } - ], - "source": [ - "import sys\n", - "sys.path.insert(0, '..')\n", - "\n", - "import os\n", - "from dotenv import load_dotenv\n", - "load_dotenv()\n", - "\n", - "from swan import SwanContract\n", - "\n", - "pk = os.getenv('KEY')\n", - "rpc = os.getenv('RPC')\n", - "\n", - "c = SwanContract(pk, rpc)\n", - "\n", - "# Test esimate lock revenue\n", - "r = c.estimate_payment(13, 100)\n", - "print(r)\n", - "\n", - "# # Test get hardware info\n", - "# r = c.hardware_info(1)\n", - "# print(r)\n", - "\n", - "# # Test get swan token balance\n", - "# r = c._get_swan_balance()\n", - "# print(r * 1e-18)\n", - "\n", - "# # Test get gas\n", - "# r = c._get_swan_gas()\n", - "# print(r* 1e-18)\n", - "\n", - "r = c._approve_swan_token(2100000000000000000000)\n", - "print(r)\n", - "\n", - "# 
gas_estimate = c.payment_contract.functions.lockRevenue('1', 1, 1).estimate_gas()\n", - "# print(gas_estimate)\n", - "\n", - "# r = c.w3.eth.get_block('latest')['baseFeePerGas']\n", - "# print(r)\n", - "\n", - "r = c.lock_revenue('1', 13, 1)\n", - "print(r)" - ] - }, - { - "cell_type": "code", - "execution_count": 2, - "metadata": {}, - "outputs": [ - { - "data": { - "text/plain": [ - "'eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJhZGRyZXNzIjoiMHhDOTE4MDYxNkIyYjc5NzM4NUFkNjRkMDlCRjg3MzBENzRFM2I1YTQxIiwiZXhwIjoyMDI1NjM1NDUxfQ.gQx1X7nxWWSixUO99YxZO6ry-bhxSnGrL_pm0slxBJM'" - ] - }, - "execution_count": 2, - "metadata": {}, - "output_type": "execute_result" - } - ], - "source": [ - "import sys\n", - "sys.path.insert(0, '..')\n", - "\n", - "import os\n", - "from dotenv import load_dotenv\n", - "load_dotenv()\n", - "\n", - "from swan import SwanAPI\n", - "\n", - "api_key = os.getenv('API_KEY')\n", - "swan_api = SwanAPI(api_key)\n", - "\n", - "hardwares = swan_api.get_hardware_config()\n", - "price_list = [(hardware.name, hardware.price) for hardware in hardwares]\n", - "print(price_list)\n", - "\n", - "from swan import MCSAPI\n", - "\n", - "api_key = os.getenv('MCS_API_KEY')\n", - "mcs_api = MCSAPI(api_key)\n", - "mcs_api.token\n", - "\n", - "from swan.object import Repository\n", - "\n", - "repo = Repository()\n", - "\n", - "# Add local directory\n", - "repo.add_local_dir('./api')\n", - "\n", - "# Upload Directory to MCS\n", - "repo.upload_local_to_mcs('swan_test', 'mar8t2', mcs_api)\n", - "\n", - "response = repo.generate_source_uri('swan_test', 'mar8s1', './source.json', mcs_client = mcs_api)\n", - "\n", - "repo.source_uri" - ] - }, - { - "cell_type": "code", - "execution_count": 1, - "metadata": {}, - "outputs": [ - { - "ename": "NameError", - "evalue": "name 'Repository' is not defined", - "output_type": "error", - "traceback": [ - "\u001b[1;31m---------------------------------------------------------------------------\u001b[0m", - "\u001b[1;31mNameError\u001b[0m Traceback (most recent call last)", - "Cell \u001b[1;32mIn[1], line 1\u001b[0m\n\u001b[1;32m----> 1\u001b[0m source \u001b[38;5;241m=\u001b[39m \u001b[43mRepository\u001b[49m()\n\u001b[0;32m 2\u001b[0m source\u001b[38;5;241m.\u001b[39mmcs_connection(mcs_api)\n\u001b[0;32m 4\u001b[0m \u001b[38;5;66;03m# Add MCS folder\u001b[39;00m\n", - "\u001b[1;31mNameError\u001b[0m: name 'Repository' is not defined" - ] - } - ], - "source": [ - "source = Repository()\n", - "source.mcs_connection(mcs_api)\n", - "\n", - "# Add MCS folder\n", - "source.update_bucket_info('swan_test', 'march4t4')\n", - "\n", - "# Get source URI\n", - "response = source.generate_source_uri('swan_test', 'mar1s2', './source.json', mcs_client=mcs_api)" - ] - }, - { - "cell_type": "code", - "execution_count": 1, - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "gmP5D6yRr8\n" - ] - }, - { - "name": "stdout", - "output_type": "stream", - "text": [ - "eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJmcmVzaCI6ZmFsc2UsImlhdCI6MTcxMTU5NjYzOSwianRpIjoiMDY0NGI3MmEtM2I5OC00ZWEyLTk2NTgtNDYzMTM3ZTI1M2Q5IiwidHlwZSI6ImFjY2VzcyIsInN1YiI6IjB4NjFjM2UwM2RiZWQ1NWY1REUyMTM3MzJlODE2RjhBOEZkNkU5YmZGMCIsIm5iZiI6MTcxMTU5NjYzOSwiY3NyZiI6Ijk0MzJmOGUzLWQxYzEtNDgwZS05ZmFiLWIyZTAzNmFhN2M5NyIsImV4cCI6MTcxNDE4ODYzOX0.ja7LPcp6uulYsRCt_BQL4ahvpayMP8PFtt3boNWF2A4\n", - "{'data': {'task': {'created_at': '1711596643', 'end_at': '1711633839', 'leading_job_id': None, 'status': 'created', 'task_detail_cid': 
'https://data.mcs.lagrangedao.org/ipfs/QmaPqdVdmRJX7sos7whFhK7Da3XxzKt98UNYdata7o6zUo', 'updated_at': '1711596643', 'uuid': 'bf0093da-4489-412d-82df-b418cce5dba5'}}, 'message': 'Task_uuid created.', 'status': 'success'}\n" - ] - } - ], - "source": [ - "import sys\n", - "sys.path.insert(0, '..')\n", - "\n", - "import os\n", - "from dotenv import load_dotenv\n", - "load_dotenv()\n", - "\n", - "api_key = os.getenv('API_KEY')\n", - "print(api_key)\n", - "\n", - "# Test login \n", - "from swan import SwanAPI\n", - "\n", - "client = SwanAPI(api_key)\n", - "print(client.token)\n", - "\n", - "n = os.getenv('NAME')\n", - "r = os.getenv('REGION')\n", - "s = os.getenv('START')\n", - "d = os.getenv('DURATION')\n", - "u = os.getenv('SOURCE')\n", - "w = os.getenv('WALLET')\n", - "t = os.getenv('TX')\n", - "r = client.deploy_task(n, r, s, d, u, w, t, 10)\n", - "print(r)" - ] - }, - { - "cell_type": "code", - "execution_count": 2, - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "{'data': {'computing_providers': [{'allowed_nodes': None, 'autobid': 1, 'city': 'Richmond', 'country': None, 'created_at': '1706081835', 'deleted_at': None, 'last_active_at': None, 'lat': 37.5147, 'lon': -77.5034, 'multi_address': '/ip4/207.254.208.84/tcp/8085', 'name': 'meta.crosschain.computer', 'node_id': '049669a235dc8c2d9532903d9683ac44c02a6d2a4185efde76305f768b7cea32726f4c9b241bfc9ed6c62d4edb3694311c41ecc906c50062891d323553f0fc73f4', 'online': True, 'public_address': '0xFbc1d38a2127D81BFe3EA347bec7310a1cfa2373', 'region': 'North Carolina-US', 'score': 100, 'status': 'Active', 'updated_at': '1712154366'}, {'allowed_nodes': None, 'autobid': 1, 'city': 'Richmond', 'country': None, 'created_at': '1706081577', 'deleted_at': None, 'last_active_at': None, 'lat': 37.5147, 'lon': -77.5034, 'multi_address': '/ip4/207.254.208.84/tcp/8088', 'name': 'computing.nebulablock.com', 'node_id': '0431d0a54e995fc8bb45fdb082b7a60cf84137b67e3939942d8b3b2ef04ea407531bc9fe41d157d1251d46fa8f95eab34c3cf48e110fdac50aae352a1634a9ad40', 'online': True, 'public_address': '0xFbc1d38a2127D81BFe3EA347bec7310a1cfa2373', 'region': 'North Carolina-US', 'score': 100, 'status': 'Active', 'updated_at': '1712154359'}, {'allowed_nodes': None, 'autobid': 1, 'city': 'Richmond', 'country': None, 'created_at': '1706081318', 'deleted_at': None, 'last_active_at': None, 'lat': 37.5147, 'lon': -77.5034, 'multi_address': '/ip4/207.254.208.84/tcp/8092', 'name': 'computing.fogmeta.com', 'node_id': '0485e73d7293125d58ac2b6106c4949d6e447d128d24b5ef6fad64247f901b02b505b095b076753a85d276bce133dbf1da4c43f405ff44e9b70004d5f98df0b6c8', 'online': True, 'public_address': '0xFbc1d38a2127D81BFe3EA347bec7310a1cfa2373', 'region': 'North Carolina-US', 'score': 100, 'status': 'Active', 'updated_at': '1712154354'}], 'jobs': [{'bidder_id': '049669a235dc8c2d9532903d9683ac44c02a6d2a4185efde76305f768b7cea32726f4c9b241bfc9ed6c62d4edb3694311c41ecc906c50062891d323553f0fc73f4', 'build_log': 'wss://log.meta.crosschain.computer:8085/api/v1/computing/lagrange/spaces/log?space_id=QmPsBv8jYgosyq6x8EWkcK3659b66QvXhhEUVP5ar6mhJw&type=build', 'container_log': 'wss://log.meta.crosschain.computer:8085/api/v1/computing/lagrange/spaces/log?space_id=QmPsBv8jYgosyq6x8EWkcK3659b66QvXhhEUVP5ar6mhJw&type=container', 'created_at': '1711596659', 'duration': 36000, 'ended_at': '1711633839', 'hardware': 'C1ae.small', 'job_real_uri': 'https://6nba9kx8o3.meta.crosschain.computer', 'job_result_uri': 
'https://42f6d9f62851.acl.swanipfs.com/ipfs/QmPmkxK32gsuLryKH7vXix54SaLkf6cKF3RuLdUoApdoCf', 'job_source_uri': 'https://plutotest.acl.swanipfs.com/ipfs/QmPsBv8jYgosyq6x8EWkcK3659b66QvXhhEUVP5ar6mhJw', 'name': 'Job-823a6b79-9ea3-4276-9119-8423d9de26ad', 'start_at': '1711597839', 'status': 'Cancelled', 'storage_source': 'lagrange', 'task_uuid': 'bf0093da-4489-412d-82df-b418cce5dba5', 'updated_at': '1711633902', 'uuid': '823a6b79-9ea3-4276-9119-8423d9de26ad'}, {'bidder_id': '0431d0a54e995fc8bb45fdb082b7a60cf84137b67e3939942d8b3b2ef04ea407531bc9fe41d157d1251d46fa8f95eab34c3cf48e110fdac50aae352a1634a9ad40', 'build_log': 'wss://log.computing.nebulablock.com:8088/api/v1/computing/lagrange/spaces/log?space_id=QmPsBv8jYgosyq6x8EWkcK3659b66QvXhhEUVP5ar6mhJw&type=build', 'container_log': 'wss://log.computing.nebulablock.com:8088/api/v1/computing/lagrange/spaces/log?space_id=QmPsBv8jYgosyq6x8EWkcK3659b66QvXhhEUVP5ar6mhJw&type=container', 'created_at': '1711596666', 'duration': 36000, 'ended_at': '1711633839', 'hardware': 'C1ae.small', 'job_real_uri': 'https://xgl2yunbz4.computing.nebulablock.com', 'job_result_uri': 'https://42f6d9f62851.acl.swanipfs.com/ipfs/QmQ2yumTasxNjk5WWzeZzDQLYpzcsK1o7iUvZ1kn6mtgRP', 'job_source_uri': 'https://plutotest.acl.swanipfs.com/ipfs/QmPsBv8jYgosyq6x8EWkcK3659b66QvXhhEUVP5ar6mhJw', 'name': 'Job-45a01e47-a047-4d33-8e5a-e27b4f967dc7', 'start_at': '1711597839', 'status': 'Cancelled', 'storage_source': 'lagrange', 'task_uuid': 'bf0093da-4489-412d-82df-b418cce5dba5', 'updated_at': '1711633906', 'uuid': '45a01e47-a047-4d33-8e5a-e27b4f967dc7'}, {'bidder_id': '0485e73d7293125d58ac2b6106c4949d6e447d128d24b5ef6fad64247f901b02b505b095b076753a85d276bce133dbf1da4c43f405ff44e9b70004d5f98df0b6c8', 'build_log': 'wss://log.computing.fogmeta.com:8092/api/v1/computing/lagrange/spaces/log?space_id=QmPsBv8jYgosyq6x8EWkcK3659b66QvXhhEUVP5ar6mhJw&type=build', 'container_log': 'wss://log.computing.fogmeta.com:8092/api/v1/computing/lagrange/spaces/log?space_id=QmPsBv8jYgosyq6x8EWkcK3659b66QvXhhEUVP5ar6mhJw&type=container', 'created_at': '1711596672', 'duration': 36000, 'ended_at': '1711633839', 'hardware': 'C1ae.small', 'job_real_uri': 'https://3zy5fntwyh.computing.fogmeta.com', 'job_result_uri': 'https://42f6d9f62851.acl.swanipfs.com/ipfs/QmajtnUeNZgdhpn1yZkbteaJibF3eNQQUcX23aqcNAzYeR', 'job_source_uri': 'https://plutotest.acl.swanipfs.com/ipfs/QmPsBv8jYgosyq6x8EWkcK3659b66QvXhhEUVP5ar6mhJw', 'name': 'Job-b9224b4e-be85-4230-ac49-338fd1fdcdf7', 'start_at': '1711597839', 'status': 'Cancelled', 'storage_source': 'lagrange', 'task_uuid': 'bf0093da-4489-412d-82df-b418cce5dba5', 'updated_at': '1711597890', 'uuid': 'b9224b4e-be85-4230-ac49-338fd1fdcdf7'}], 'task': {'created_at': '1711596643', 'end_at': '1711633839', 'leading_job_id': '823a6b79-9ea3-4276-9119-8423d9de26ad', 'status': 'finished', 'task_detail_cid': 'https://data.mcs.lagrangedao.org/ipfs/QmaPqdVdmRJX7sos7whFhK7Da3XxzKt98UNYdata7o6zUo', 'updated_at': '1711633906', 'uuid': 'bf0093da-4489-412d-82df-b418cce5dba5'}}, 'message': \"fetch task info for task_uuid='bf0093da-4489-412d-82df-b418cce5dba5' successfully\", 'status': 'success'}\n" - ] - } - ], - "source": [ - "import sys\n", - "sys.path.insert(0, '..')\n", - "import os\n", - "from dotenv import load_dotenv\n", - "load_dotenv()\n", - "api_key = os.getenv('API_KEY')\n", - "# Test login \n", - "from swan import SwanAPI\n", - "client = SwanAPI(api_key)\n", - "\n", - "id = os.getenv('TASK')\n", - "r = client.get_deployment_info(id)\n", - "print(r)" - ] - }, - { - "cell_type": "code", - 
"execution_count": 3, - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "True\n" - ] - } - ], - "source": [ - "r = client.get_deployment_info_json(id, './result.json')\n", - "print(r)" - ] - }, - { - "cell_type": "code", - "execution_count": 3, - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "['https://6nba9kx8o3.meta.crosschain.computer', 'https://xgl2yunbz4.computing.nebulablock.com', 'https://3zy5fntwyh.computing.fogmeta.com']\n" - ] - } - ], - "source": [ - "r = client.get_real_url(id)\n", - "print(r)" - ] - }, - { - "cell_type": "code", - "execution_count": 11, - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "https://plutotest.acl.swanipfs.com/ipfs/QmdVrZFDRUkmAk8DuRGcfNc6WrK2kYeojVVo745Z5QSNGA\n" - ] - } - ], - "source": [ - "print(source.source_uri)" - ] - }, - { - "cell_type": "code", - "execution_count": 3, - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "[('C1ae.small', '0.0'), ('C1ae.medium', '1.0'), ('M1ae.small', '2.0'), ('M1ae.medium', '3.0'), ('M1ae.large', '4.0'), ('M1ae.xlarge', '5.0'), ('M1ae.2xlarge', '6.0'), ('M1ae.3xlarge', '6.5'), ('M2ae.small', '7.0'), ('M2ae.medium', '8.0'), ('M2ae.large', '8.5'), ('M2ae.xlarge', '9.0'), ('G1ae.small', '10.0'), ('G1ae.medium', '11.0'), ('G1ae.large', '12.0'), ('G1ae.xlarge', '13.0'), ('G2ae.small', '9.0'), ('G2ae.medium', '10.0'), ('G2ae.large', '11.0'), ('G2ae.lxarge', '12.0'), ('Hpc1ae.small', '14.0'), ('Hpc1ae.medium', '16.0'), ('Hpc1ae.large', '18.0'), ('Hpc1ae.xlarge', '20.0'), ('Hpc1ae.2xlarge', '21.0'), ('Hpc1ae.3xlarge', '21.0'), ('T1ae.small', '32.0'), ('T1ae.medium', '36.0'), ('T1ae.large', '40.0'), ('T1ae.xlarge', '42.0'), ('T1ae.2xlarge', '48.0'), ('T1ae.3xlarge', '50.0'), ('Hpc2ae.small', '22.0'), ('Hpc2ae.medium', '24.0'), ('Hpc2ae.large', '26.0'), ('Hpc2ae.xlarge', '28.0'), ('P1ae.small', '30.0'), ('P1ae.medium', '32.0'), ('P1ae.large', '40.0'), ('P1ae.xlarge', '45.0'), ('T1az.large', '52.0'), ('T1az.xlarge', '55.0'), ('T1az.2xlarge', '60.0'), ('T1az.3xlarge', '62.0'), ('T1az.4xlarge', '65.0'), ('T1az.5xlarge', '68.0'), ('T1az.6xlarge', '70.0'), ('T1az.7xlarge', '72.0'), ('T1az.8xlarge', '75.0'), ('T1az.9xlarge', '90.0'), ('T1az.10xlarge', '80.0'), ('T2az.large', '62.0'), ('T2az.xlarge', '65.0'), ('T2az.2xlarge', '70.0'), ('T2az.3xlarge', '72.0'), ('T2az.4xlarge', '75.0'), ('T2az.5xlarge', '78.0'), ('T2az.6xlarge', '80.0'), ('T2az.7xlarge', '82.0'), ('T2az.8xlarge', '85.0'), ('T2az.9xlarge', '100.0'), ('T2az.10xlarge', '90.0'), ('Hpc2ad.small', '16.0'), ('Hpc2ad.medium', '18.0'), ('Hpc2ad.large', '20.0'), ('Hpc2ad.xlarge', '21.0'), ('Hpc2ad.1xlarge', '22.0'), ('Hpc2az.small', '21.0'), ('Hpc2az.medium', '23.0'), ('Hpc2az.large', '25.0'), ('Hpc2az.xlarge', '26.0'), ('Hpc2az.1xlarge', '27.0'), ('R1ae.small', '12.0'), ('R1ae.medium', '22.0'), ('R1ae.large', '30.0'), ('R2ae.small', '35.0'), ('R2ae.medium', '38.0'), ('R2ae.large', '50.0'), ('R2ae.xlarge', '52.0'), ('R2ae.1xlarge', '54.0'), ('R2ae.2xlarge', '56.0'), ('R2ae.3xlarge', '58.0')]\n" - ] - }, - { - "name": "stderr", - "output_type": "stream", - "text": [ - "mar8s2: 0%| | 0.00/994 [00:00" - ] - }, - "execution_count": 10, - "metadata": {}, - "output_type": "execute_result" - } - ], - "source": [ - "client.upload_file('swan_test', 'mar4test/dev_test.ipynb', 'dev_test.ipynb')" - ] - }, - { - "cell_type": "code", - "execution_count": 4, - "metadata": {}, - "outputs": [ - { 
- "name": "stdout", - "output_type": "stream", - "text": [ - "{'data': {'files': []}}\n" - ] - } - ], - "source": [ - "import sys\n", - "sys.path.insert(0, '..')\n", - "from swan.common.utils import datetime_to_unixtime, object_to_filename\n", - "from swan.object.source_uri import SourceFilesInfo\n", - "\n", - "# r = datetime_to_unixtime('2024-03-01T22:13:23Z')\n", - "# print(r)\n", - "\n", - "# r1, r2 = object_to_filename('mine\\\\file\\\\whatever')\n", - "# print(f'dir: {r1}, file: {r2}')\n", - "\n", - "s = SourceFilesInfo()\n", - "s.mcs_connection(client)\n", - "\n", - "s.add_folder(\"swan_test\", \"\")\n", - "\n", - "print(s.to_dict())" - ] - }, - { - "cell_type": "code", - "execution_count": 1, - "metadata": {}, - "outputs": [], - "source": [ - "import sys\n", - "sys.path.insert(0, '..')\n", - "\n", - "import os\n", - "from dotenv import load_dotenv\n", - "load_dotenv()\n", - "mcs_ak = os.getenv('MCS_API_KEY')\n", - "\n", - "from swan import MCSAPI\n", - "client = MCSAPI(mcs_ak)\n", - "\n", - "from swan.object.source_uri import Repository\n", - "\n", - "repo = Repository()\n", - "repo.add_local_dir('./api')" - ] - }, - { - "cell_type": "code", - "execution_count": 8, - "metadata": {}, - "outputs": [], - "source": [ - "res = repo.upload_local_to_mcs(\"swan_test\", \"march4t4\", client)" - ] - }, - { - "cell_type": "code", - "execution_count": 9, - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "swan_test\n", - "march4t4\n" - ] - } - ], - "source": [ - "print(repo.bucket)\n", - "print(repo.path)" - ] - }, - { - "cell_type": "code", - "execution_count": 12, - "metadata": {}, - "outputs": [ - { - "name": "stderr", - "output_type": "stream", - "text": [ - "init.py: 270B [00:05, 49.4B/s] \n" - ] - } - ], - "source": [ - "# print(f'{repo.bucket}, {repo.path}')\n", - "# client.upload_file(\"swan_test\", \"march4t4/init.py\", \"./__init__.py\")\n", - "res = repo.generate_source_uri(\"swan_test\", \"mar1s2\", \"test.json\", mcs_client=client)" - ] - }, - { - "cell_type": "code", - "execution_count": 13, - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "https://plutotest.acl.swanipfs.com/ipfs/QmdVrZFDRUkmAk8DuRGcfNc6WrK2kYeojVVo745Z5QSNGA\n" - ] - } - ], - "source": [ - "print(repo.source_uri)" - ] - }, - { - "cell_type": "code", - "execution_count": 23, - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "['mar1s2', 'march4t2\\\\test_swan_api.py', 'march4t2', 'march4t1\\\\test_swan_api.py', 'march4t1']\n", - "['https://plutotest.acl.swanipfs.com/ipfs/QmY5fFEFdPkfA2wmNgxPLhsZkMAmLrWQvijYZdCuUHAytf', 'https://plutotest.acl.swanipfs.com/ipfs/QmcwvuhugPXeiTxD6L9RGs9pC5DqN8SZ36TZasdGE2vqeD', 'https://plutotest.acl.swanipfs.com/ipfs/', 'https://plutotest.acl.swanipfs.com/ipfs/QmcwvuhugPXeiTxD6L9RGs9pC5DqN8SZ36TZasdGE2vqeD', 'https://plutotest.acl.swanipfs.com/ipfs/']\n", - "['2024-03-04T21:35:18Z', '2024-03-04T21:30:31Z', '2024-03-04T21:30:30Z', '2024-03-04T21:27:47Z', '2024-03-04T21:27:46Z']\n", - "['QmY5fFEFdPkfA2wmNgxPLhsZkMAmLrWQvijYZdCuUHAytf', 'QmcwvuhugPXeiTxD6L9RGs9pC5DqN8SZ36TZasdGE2vqeD', '', 'QmcwvuhugPXeiTxD6L9RGs9pC5DqN8SZ36TZasdGE2vqeD', '']\n", - "[False, False, True, False, True]\n", - "{'data': {'files': [{'cid': 'QmY5fFEFdPkfA2wmNgxPLhsZkMAmLrWQvijYZdCuUHAytf', 'created_at': 1709606118.0, 'name': 'mar1s2', 'updated_at': 1709606118.0, 'url': 'https://plutotest.acl.swanipfs.com/ipfs/QmY5fFEFdPkfA2wmNgxPLhsZkMAmLrWQvijYZdCuUHAytf'}, {'cid': 
'QmcwvuhugPXeiTxD6L9RGs9pC5DqN8SZ36TZasdGE2vqeD', 'created_at': 1709605831.0, 'name': 'march4t2\\\\test_swan_api.py', 'updated_at': 1709605831.0, 'url': 'https://plutotest.acl.swanipfs.com/ipfs/QmcwvuhugPXeiTxD6L9RGs9pC5DqN8SZ36TZasdGE2vqeD'}, {'cid': 'QmcwvuhugPXeiTxD6L9RGs9pC5DqN8SZ36TZasdGE2vqeD', 'created_at': 1709605667.0, 'name': 'march4t1\\\\test_swan_api.py', 'updated_at': 1709605667.0, 'url': 'https://plutotest.acl.swanipfs.com/ipfs/QmcwvuhugPXeiTxD6L9RGs9pC5DqN8SZ36TZasdGE2vqeD'}]}}\n" - ] - } - ], - "source": [ - "files = client._get_full_file_list(bucket_name=repo.bucket, prefix=repo.path)\n", - "print([file.object_name for file in files])\n", - "print([file.ipfs_url for file in files])\n", - "print([file.created_at for file in files])\n", - "print([file.payloadCid for file in files])\n", - "print([file.is_folder for file in files])\n", - "\n", - "from swan.object.source_uri import SourceFilesInfo\n", - "\n", - "s = SourceFilesInfo()\n", - "s.mcs_connection(client)\n", - "s.add_folder(\"swan_test\", \"\")\n", - "print(s.to_dict())" - ] - } - ], - "metadata": { - "kernelspec": { - "display_name": "venv", - "language": "python", - "name": "python3" - }, - "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 3 - }, - "file_extension": ".py", - "mimetype": "text/x-python", - "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython3", - "version": "3.8.9" - } - }, - "nbformat": 4, - "nbformat_minor": 2 -} diff --git a/test/nsfw_test.py b/test/nsfw_test.py deleted file mode 100644 index d6ba3ee6..00000000 --- a/test/nsfw_test.py +++ /dev/null @@ -1,10 +0,0 @@ -import sys -sys.path.insert(0, '..') - -import os -from dotenv import load_dotenv -load_dotenv() - -api_key = os.getenv('API_KEY') - - diff --git a/test/source.json b/test/source.json deleted file mode 100644 index 11993453..00000000 --- a/test/source.json +++ /dev/null @@ -1 +0,0 @@ -{"data": {"files": [{"cid": "68bc17f9ff2104a9d7b6777058bb4c343ca72609", "created_at": null, "name": "ZihangChenNBAI/hello/main/.gitignore", "updated_at": null, "url": "https://raw.githubusercontent.com/ZihangChenNBAI/hello/main/.gitignore"}, {"cid": "6631425230a9a1db084374f990d0640aec1f9952", "created_at": null, "name": "ZihangChenNBAI/hello/main/Dockerfile", "updated_at": null, "url": "https://raw.githubusercontent.com/ZihangChenNBAI/hello/main/Dockerfile"}, {"cid": "f4d71977a9fdc266f84d94b74aa391a04880bbd8", "created_at": null, "name": "ZihangChenNBAI/hello/main/LICENSE", "updated_at": null, "url": "https://raw.githubusercontent.com/ZihangChenNBAI/hello/main/LICENSE"}, {"cid": "2de669ef819ed4c3f5c696080d67d6603f987aaa", "created_at": null, "name": "ZihangChenNBAI/hello/main/README.md", "updated_at": null, "url": "https://raw.githubusercontent.com/ZihangChenNBAI/hello/main/README.md"}, {"cid": "0e13864fe8c5476f1acf5a37323ae9360361f500", "created_at": null, "name": "ZihangChenNBAI/hello/main/app.py", "updated_at": null, "url": "https://raw.githubusercontent.com/ZihangChenNBAI/hello/main/app.py"}, {"cid": "72f88c115dbb0d27fd4ee3063f4032336ced7301", "created_at": null, "name": "ZihangChenNBAI/hello/main/requirements.txt", "updated_at": null, "url": "https://raw.githubusercontent.com/ZihangChenNBAI/hello/main/requirements.txt"}], "owner": {"public_address": "0x61c3e03dbed55f5DE213732e816F8A8Fd6E9bfF0"}, "space": {"activeOrder": {"config": {"description": "CPU only \u00b7 2 vCPU \u00b7 2 GiB", "hardware": "CPU only", "hardware_id": 0, "hardware_type": "CPU", "memory": 2, "name": "C1ae.small", 
"price_per_hour": 0.0, "vcpu": 2}}, "name": "hello", "uuid": "4ce1ebea-2474-45ab-a7d9-21a912d75061"}}} \ No newline at end of file From 71f5dd4acbdbbbc9ed6d1de224916be36746ef9d Mon Sep 17 00:00:00 2001 From: Zihang Chen <159826530+ZihangChenNBAI@users.noreply.github.com> Date: Fri, 3 May 2024 15:48:07 -0400 Subject: [PATCH 03/10] updated error --- README.md | 2 +- ...e-demo.ipynb => example-demo-v0.0.2.ipynb} | 344 ++++++++--- ...=> example-deploy-v0.0.1-deprecated.ipynb} | 48 +- examples/example-deploy-v2.ipynb | 560 ------------------ swan/api_client.py | 8 +- 5 files changed, 292 insertions(+), 670 deletions(-) rename examples/{example-demo.ipynb => example-demo-v0.0.2.ipynb} (53%) rename examples/{example-deploy-v1.ipynb => example-deploy-v0.0.1-deprecated.ipynb} (69%) delete mode 100644 examples/example-deploy-v2.ipynb diff --git a/README.md b/README.md index 17c3befc..792b6f25 100644 --- a/README.md +++ b/README.md @@ -49,7 +49,7 @@ pip install swan-sdk git clone https://github.com/swanchain/orchestrator-sdk.git ``` -## Quick Start Guide SDK V2 +## Quick Start Guide for Swan SDK Jump into using the SDK with this quick example: ### 1. Get SwanHub API Key diff --git a/examples/example-demo.ipynb b/examples/example-demo-v0.0.2.ipynb similarity index 53% rename from examples/example-demo.ipynb rename to examples/example-demo-v0.0.2.ipynb index 03bc57bd..21b0af53 100644 --- a/examples/example-demo.ipynb +++ b/examples/example-demo-v0.0.2.ipynb @@ -29,21 +29,32 @@ "#### get an `API_KEY`\n", "\n", "- For test version, get `API_KEY` in dashboard page: https://orchestrator-test.swanchain.io\n", - "- For prod version, get `API_KEY` in dashboard page: https://orchestrator.swanchain.io" + "- For prod version, get `API_KEY` in dashboard page: https://orchestrator.swanchain.io\n", + "\n", + "If use this repository to test on your local machine, add `sys.path.insert(0, '..')` at the beginning, and run code in the root directory of this repository.\n", + "\n", + "You need to add environment file `.env` in your local directory, including the following parameters (`PK` is private key):\n", + "\n", + "```\n", + "API_KEY=\n", + "WALLET=\n", + "PK=\n", + "```" ] }, { "cell_type": "code", - "execution_count": 1, + "execution_count": 5, "metadata": {}, "outputs": [], "source": [ "import sys\n", - "sys.path.insert(0, '..')\n", + "sys.path.insert(0, '..') \n", "\n", "import os\n", "import time\n", "import dotenv\n", + "import json\n", "dotenv.load_dotenv()\n", "from swan import SwanAPI\n", "\n", @@ -63,20 +74,25 @@ }, { "cell_type": "code", - "execution_count": 2, + "execution_count": 6, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ - "{'client_contract_address': '0x20a67c6Bea000fAf0BE862BB254F092abF0E5b98', 'payment_contract_address': '0x5094A609Af5184d076Be2DF741820732126b4Fd2', 'rpc_url': 'https://rpc-atom-internal.swanchain.io', 'swan_token_contract_address': '0x91B25A65b295F0405552A4bbB77879ab5e38166c'}\n" + "{\n", + " \"client_contract_address\": \"0x20a67c6Bea000fAf0BE862BB254F092abF0E5b98\",\n", + " \"payment_contract_address\": \"0x5094A609Af5184d076Be2DF741820732126b4Fd2\",\n", + " \"rpc_url\": \"https://rpc-atom-internal.swanchain.io\",\n", + " \"swan_token_contract_address\": \"0x91B25A65b295F0405552A4bbB77879ab5e38166c\"\n", + "}\n" ] } ], "source": [ "r = swan_api.contract_info\n", - "print(r)" + "print(json.dumps(r, indent=2))" ] }, { @@ -88,36 +104,230 @@ }, { "cell_type": "code", - "execution_count": 3, + "execution_count": 7, "metadata": {}, "outputs": 
[ { "data": { "text/plain": [ - "{'id': 0,\n", - " 'name': 'C1ae.small',\n", - " 'description': 'CPU only · 2 vCPU · 2 GiB',\n", - " 'type': 'CPU',\n", - " 'reigion': ['North Carolina-US',\n", - " 'Bashkortostan Republic-RU',\n", - " 'Kyiv City-UA',\n", - " 'Kowloon City-HK',\n", - " 'Tokyo-JP',\n", - " 'California-US',\n", - " 'Central and Western District-HK',\n", - " 'Quebec-CA',\n", - " 'North West-SG',\n", - " 'Kwai Tsing-HK',\n", - " 'Bavaria-DE',\n", - " 'Saxony-DE',\n", - " 'Guangdong-CN',\n", - " 'Kowloon-HK',\n", - " 'North Rhine-Westphalia-DE'],\n", - " 'price': '0.0',\n", - " 'status': 'available'}" + "[{'id': 0,\n", + " 'name': 'C1ae.small',\n", + " 'description': 'CPU only · 2 vCPU · 2 GiB',\n", + " 'type': 'CPU',\n", + " 'reigion': ['North Carolina-US',\n", + " 'Bashkortostan Republic-RU',\n", + " 'Kyiv City-UA',\n", + " 'Kowloon City-HK',\n", + " 'Tokyo-JP',\n", + " 'California-US',\n", + " 'Central and Western District-HK',\n", + " 'Quebec-CA',\n", + " 'North West-SG',\n", + " 'Kwai Tsing-HK',\n", + " 'Bavaria-DE',\n", + " 'Saxony-DE',\n", + " 'Guangdong-CN',\n", + " 'Kowloon-HK',\n", + " 'North Rhine-Westphalia-DE'],\n", + " 'price': '0.0',\n", + " 'status': 'available'},\n", + " {'id': 1,\n", + " 'name': 'C1ae.medium',\n", + " 'description': 'CPU only · 4 vCPU · 4 GiB',\n", + " 'type': 'CPU',\n", + " 'reigion': ['North Carolina-US',\n", + " 'Bashkortostan Republic-RU',\n", + " 'Kyiv City-UA',\n", + " 'Kowloon City-HK',\n", + " 'Tokyo-JP',\n", + " 'California-US',\n", + " 'Central and Western District-HK',\n", + " 'Quebec-CA',\n", + " 'North West-SG',\n", + " 'Kwai Tsing-HK',\n", + " 'Bavaria-DE',\n", + " 'Guangdong-CN',\n", + " 'Kowloon-HK',\n", + " 'North Rhine-Westphalia-DE'],\n", + " 'price': '1.0',\n", + " 'status': 'available'},\n", + " {'id': 4,\n", + " 'name': 'M1ae.large',\n", + " 'description': 'Nvidia 3060 · 8 vCPU · 8 GiB',\n", + " 'type': 'GPU',\n", + " 'reigion': ['Kyiv City-UA'],\n", + " 'price': '4.0',\n", + " 'status': 'available'},\n", + " {'id': 6,\n", + " 'name': 'M1ae.2xlarge',\n", + " 'description': 'Nvidia 2080 Ti · 4 vCPU · 8 GiB',\n", + " 'type': 'GPU',\n", + " 'reigion': ['North Carolina-US'],\n", + " 'price': '6.0',\n", + " 'status': 'available'},\n", + " {'id': 7,\n", + " 'name': 'M1ae.3xlarge',\n", + " 'description': 'Nvidia 2080 Ti · 8 vCPU · 16 GiB',\n", + " 'type': 'GPU',\n", + " 'reigion': ['North Carolina-US'],\n", + " 'price': '6.5',\n", + " 'status': 'available'},\n", + " {'id': 12,\n", + " 'name': 'G1ae.small',\n", + " 'description': 'Nvidia 3080 · 4 vCPU · 8 GiB',\n", + " 'type': 'GPU',\n", + " 'reigion': ['North Carolina-US',\n", + " 'Kowloon City-HK',\n", + " 'Tokyo-JP',\n", + " 'California-US',\n", + " 'Quebec-CA',\n", + " 'Kwai Tsing-HK'],\n", + " 'price': '10.0',\n", + " 'status': 'available'},\n", + " {'id': 13,\n", + " 'name': 'G1ae.medium',\n", + " 'description': 'Nvidia 3080 · 8 vCPU · 16 GiB',\n", + " 'type': 'GPU',\n", + " 'reigion': ['North Carolina-US',\n", + " 'Kowloon City-HK',\n", + " 'Tokyo-JP',\n", + " 'California-US',\n", + " 'Quebec-CA',\n", + " 'Kwai Tsing-HK'],\n", + " 'price': '11.0',\n", + " 'status': 'available'},\n", + " {'id': 20,\n", + " 'name': 'Hpc1ae.small',\n", + " 'description': 'Nvidia 3090 · 4 vCPU · 8 GiB',\n", + " 'type': 'GPU',\n", + " 'reigion': ['North Carolina-US',\n", + " 'California-US',\n", + " 'Quebec-CA',\n", + " 'Guangdong-CN',\n", + " 'Kowloon-HK'],\n", + " 'price': '14.0',\n", + " 'status': 'available'},\n", + " {'id': 21,\n", + " 'name': 'Hpc1ae.medium',\n", + " 'description': 'Nvidia 
3090 · 8 vCPU · 16 GiB',\n", + " 'type': 'GPU',\n", + " 'reigion': ['North Carolina-US',\n", + " 'California-US',\n", + " 'Quebec-CA',\n", + " 'Guangdong-CN',\n", + " 'Kowloon-HK'],\n", + " 'price': '16.0',\n", + " 'status': 'available'},\n", + " {'id': 24,\n", + " 'name': 'Hpc1ae.2xlarge',\n", + " 'description': 'NVIDIA A4000 · 4 vCPU · 8 GiB',\n", + " 'type': 'AI GPU',\n", + " 'reigion': ['North Carolina-US', 'North Rhine-Westphalia-DE'],\n", + " 'price': '21.0',\n", + " 'status': 'available'},\n", + " {'id': 25,\n", + " 'name': 'Hpc1ae.3xlarge',\n", + " 'description': 'NVIDIA A4000 · 8 vCPU · 16 GiB',\n", + " 'type': 'AI GPU',\n", + " 'reigion': ['North Carolina-US', 'North Rhine-Westphalia-DE'],\n", + " 'price': '21.0',\n", + " 'status': 'available'},\n", + " {'id': 27,\n", + " 'name': 'T1ae.medium',\n", + " 'description': 'Nvidia 2080 Ti · 12 vCPU · 64 GiB',\n", + " 'type': 'GPU',\n", + " 'reigion': ['North Carolina-US'],\n", + " 'price': '36.0',\n", + " 'status': 'available'},\n", + " {'id': 32,\n", + " 'name': 'Hpc2ae.small',\n", + " 'description': 'Nvidia 4090 · 4 vCPU · 8 GiB',\n", + " 'type': 'AI GPU',\n", + " 'reigion': ['Bavaria-DE', 'Tokyo-JP', 'Bashkortostan Republic-RU'],\n", + " 'price': '22.0',\n", + " 'status': 'available'},\n", + " {'id': 33,\n", + " 'name': 'Hpc2ae.medium',\n", + " 'description': 'Nvidia 4090 · 8 vCPU · 16 GiB',\n", + " 'type': 'AI GPU',\n", + " 'reigion': ['Bavaria-DE', 'Tokyo-JP', 'Bashkortostan Republic-RU'],\n", + " 'price': '24.0',\n", + " 'status': 'available'},\n", + " {'id': 42,\n", + " 'name': 'T1az.2xlarge',\n", + " 'description': 'Nvidia 4090 · 8 vCPU · 64 GiB',\n", + " 'type': 'AI GPU',\n", + " 'reigion': ['Bavaria-DE', 'Tokyo-JP', 'Bashkortostan Republic-RU'],\n", + " 'price': '60.0',\n", + " 'status': 'available'},\n", + " {'id': 44,\n", + " 'name': 'T1az.4xlarge',\n", + " 'description': 'Nvidia A4000 · 8 vCPU · 64 GiB',\n", + " 'type': 'AI GPU',\n", + " 'reigion': ['North Carolina-US'],\n", + " 'price': '65.0',\n", + " 'status': 'available'},\n", + " {'id': 53,\n", + " 'name': 'T2az.2xlarge',\n", + " 'description': 'Nvidia 4090 · 12 vCPU · 128 GiB',\n", + " 'type': 'AI GPU',\n", + " 'reigion': ['Bavaria-DE', 'Tokyo-JP'],\n", + " 'price': '70.0',\n", + " 'status': 'available'},\n", + " {'id': 55,\n", + " 'name': 'T2az.4xlarge',\n", + " 'description': 'Nvidia A4000 · 12 vCPU · 128 GiB',\n", + " 'type': 'AI GPU',\n", + " 'reigion': ['North Carolina-US'],\n", + " 'price': '75.0',\n", + " 'status': 'available'},\n", + " {'id': 72,\n", + " 'name': 'R1ae.small',\n", + " 'description': 'Nvidia 2080 TI · 8 vCPU · 32 GiB',\n", + " 'type': 'GPU',\n", + " 'reigion': ['North Carolina-US'],\n", + " 'price': '12.0',\n", + " 'status': 'available'},\n", + " {'id': 73,\n", + " 'name': 'R1ae.medium',\n", + " 'description': 'Nvidia 3080 · 8 vCPU · 32 GiB',\n", + " 'type': 'GPU',\n", + " 'reigion': ['North Carolina-US',\n", + " 'Kowloon City-HK',\n", + " 'Tokyo-JP',\n", + " 'California-US',\n", + " 'Quebec-CA',\n", + " 'Kwai Tsing-HK'],\n", + " 'price': '22.0',\n", + " 'status': 'available'},\n", + " {'id': 74,\n", + " 'name': 'R1ae.large',\n", + " 'description': 'Nvidia 3090 · 8 vCPU · 32 GiB',\n", + " 'type': 'GPU',\n", + " 'reigion': ['North Carolina-US',\n", + " 'California-US',\n", + " 'Quebec-CA',\n", + " 'Guangdong-CN',\n", + " 'Kowloon-HK'],\n", + " 'price': '30.0',\n", + " 'status': 'available'},\n", + " {'id': 77,\n", + " 'name': 'R2ae.large',\n", + " 'description': 'Nvidia 4090 · 8 vCPU · 32 GiB',\n", + " 'type': 'AI GPU',\n", + " 'reigion': 
['Bavaria-DE', 'Tokyo-JP', 'Bashkortostan Republic-RU'],\n", + " 'price': '50.0',\n", + " 'status': 'available'},\n", + " {'id': 78,\n", + " 'name': 'R2ae.xlarge',\n", + " 'description': 'Nvidia A4000 · 8 vCPU · 32 GiB',\n", + " 'type': 'AI GPU',\n", + " 'reigion': ['North Carolina-US'],\n", + " 'price': '52.0',\n", + " 'status': 'available'}]" ] }, - "execution_count": 3, + "execution_count": 7, "metadata": {}, "output_type": "execute_result" } @@ -141,7 +351,7 @@ }, { "cell_type": "code", - "execution_count": 6, + "execution_count": 8, "metadata": {}, "outputs": [ { @@ -164,7 +374,7 @@ }, { "cell_type": "code", - "execution_count": 15, + "execution_count": 9, "metadata": {}, "outputs": [ { @@ -173,7 +383,7 @@ "1" ] }, - "execution_count": 15, + "execution_count": 9, "metadata": {}, "output_type": "execute_result" } @@ -193,13 +403,13 @@ }, { "cell_type": "code", - "execution_count": 7, + "execution_count": 10, "metadata": {}, "outputs": [], "source": [ "\n", "job_source_uri = swan_api.get_source_uri(\n", - " repo_uri='https://github.com/alphaflows/tetris-docker-image',\n", + " repo_uri='https://github.com/alphaflows/tetris-docker-image.git',\n", " hardware_id=hardware_id,\n", " wallet_address=os.getenv('WALLET')\n", ")" @@ -207,7 +417,7 @@ }, { "cell_type": "code", - "execution_count": 8, + "execution_count": 11, "metadata": {}, "outputs": [], "source": [ @@ -217,16 +427,16 @@ }, { "cell_type": "code", - "execution_count": 9, + "execution_count": 12, "metadata": {}, "outputs": [ { "data": { "text/plain": [ - "'https://data.mcs.lagrangedao.org/ipfs/QmSc5G8YdR4d6WQFJm844FqS4F8qAGWpo5YXr5auVamh9M'" + "'https://data.mcs.lagrangedao.org/ipfs/QmRGyt2gSvUkM8UePz5aeTW2tmw2MQrS4oXPVWLwHufqyN'" ] }, - "execution_count": 9, + "execution_count": 12, "metadata": {}, "output_type": "execute_result" } @@ -244,7 +454,7 @@ }, { "cell_type": "code", - "execution_count": 19, + "execution_count": 13, "metadata": {}, "outputs": [ { @@ -272,7 +482,7 @@ }, { "cell_type": "code", - "execution_count": 20, + "execution_count": 14, "metadata": {}, "outputs": [ { @@ -282,15 +492,15 @@ "{\n", " \"data\": {\n", " \"task\": {\n", - " \"created_at\": \"1714254304\",\n", - " \"end_at\": \"1714257898\",\n", + " \"created_at\": \"1714763824\",\n", + " \"end_at\": \"1714767416\",\n", " \"leading_job_id\": null,\n", " \"refund_amount\": null,\n", " \"status\": \"initialized\",\n", - " \"task_detail_cid\": \"https://data.mcs.lagrangedao.org/ipfs/QmXLSaBqtoWZWAUoiYxM3EDxh14kkhpUiYkVjZSK3BhfKj\",\n", + " \"task_detail_cid\": \"https://data.mcs.lagrangedao.org/ipfs/QmWkquL27Tss4GWGs5nzPnijV1gZmF5FSxA8znTQmvmn3B\",\n", " \"tx_hash\": null,\n", - " \"updated_at\": \"1714254304\",\n", - " \"uuid\": \"f4799212-4dc2-4c0b-9209-c0ac7bc48442\"\n", + " \"updated_at\": \"1714763824\",\n", + " \"uuid\": \"8d19975e-cbfc-4423-a580-2bd1ba7aac1e\"\n", " }\n", " },\n", " \"message\": \"Task_uuid initialized.\",\n", @@ -300,8 +510,6 @@ } ], "source": [ - "import json\n", - "\n", "duration=3600*duration_hour\n", "\n", "result = swan_api.create_task(\n", @@ -328,14 +536,14 @@ }, { "cell_type": "code", - "execution_count": 21, + "execution_count": 15, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ - "0x6aa4b358a4febbc00f4cc44d24ff3a02244f888aa1e569913f8f5e9d541cbb2d\n" + "0x27205cc0eeb4a2283af518c9aa5e05f436b829b0dc51ec40d08652168651b9d7\n" ] } ], @@ -355,26 +563,26 @@ }, { "cell_type": "code", - "execution_count": 22, + "execution_count": 16, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": 
"stream", "text": [ - "{'tx_hash': '0x6aa4b358a4febbc00f4cc44d24ff3a02244f888aa1e569913f8f5e9d541cbb2d', 'task_uuid': 'f4799212-4dc2-4c0b-9209-c0ac7bc48442'}\n", + "{'tx_hash': '0x27205cc0eeb4a2283af518c9aa5e05f436b829b0dc51ec40d08652168651b9d7', 'task_uuid': '8d19975e-cbfc-4423-a580-2bd1ba7aac1e'}\n", "{\n", " \"data\": {\n", " \"task\": {\n", - " \"created_at\": \"1714254304\",\n", - " \"end_at\": \"1714257898\",\n", + " \"created_at\": \"1714763824\",\n", + " \"end_at\": \"1714767416\",\n", " \"leading_job_id\": null,\n", " \"refund_amount\": null,\n", " \"status\": \"created\",\n", - " \"task_detail_cid\": \"https://data.mcs.lagrangedao.org/ipfs/QmXLSaBqtoWZWAUoiYxM3EDxh14kkhpUiYkVjZSK3BhfKj\",\n", + " \"task_detail_cid\": \"https://data.mcs.lagrangedao.org/ipfs/QmWkquL27Tss4GWGs5nzPnijV1gZmF5FSxA8znTQmvmn3B\",\n", " \"tx_hash\": null,\n", - " \"updated_at\": \"1714254312\",\n", - " \"uuid\": \"f4799212-4dc2-4c0b-9209-c0ac7bc48442\"\n", + " \"updated_at\": \"1714763841\",\n", + " \"uuid\": \"8d19975e-cbfc-4423-a580-2bd1ba7aac1e\"\n", " }\n", " },\n", " \"message\": \"Task payment validated successfully.\",\n", @@ -403,7 +611,7 @@ }, { "cell_type": "code", - "execution_count": 23, + "execution_count": 17, "metadata": {}, "outputs": [ { @@ -415,18 +623,18 @@ " \"computing_providers\": [],\n", " \"jobs\": [],\n", " \"task\": {\n", - " \"created_at\": \"1714254304\",\n", - " \"end_at\": \"1714257898\",\n", + " \"created_at\": \"1714763824\",\n", + " \"end_at\": \"1714767416\",\n", " \"leading_job_id\": null,\n", " \"refund_amount\": null,\n", - " \"status\": \"created\",\n", - " \"task_detail_cid\": \"https://data.mcs.lagrangedao.org/ipfs/QmXLSaBqtoWZWAUoiYxM3EDxh14kkhpUiYkVjZSK3BhfKj\",\n", + " \"status\": \"accepting_bids\",\n", + " \"task_detail_cid\": \"https://data.mcs.lagrangedao.org/ipfs/QmWkquL27Tss4GWGs5nzPnijV1gZmF5FSxA8znTQmvmn3B\",\n", " \"tx_hash\": null,\n", - " \"updated_at\": \"1714254312\",\n", - " \"uuid\": \"f4799212-4dc2-4c0b-9209-c0ac7bc48442\"\n", + " \"updated_at\": \"1714763861\",\n", + " \"uuid\": \"8d19975e-cbfc-4423-a580-2bd1ba7aac1e\"\n", " }\n", " },\n", - " \"message\": \"fetch task info for task_uuid='f4799212-4dc2-4c0b-9209-c0ac7bc48442' successfully\",\n", + " \"message\": \"fetch task info for task_uuid='8d19975e-cbfc-4423-a580-2bd1ba7aac1e' successfully\",\n", " \"status\": \"success\"\n", "}\n" ] @@ -450,14 +658,14 @@ }, { "cell_type": "code", - "execution_count": 28, + "execution_count": null, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ - "['https://1aql2r340t.computing.nebulablock.com', 'https://e958tag7ox.lag.nebulablock.com', 'https://hygltuf3x5.computing.storefrontiers.cn']\n" + "['https://077dupt8wa.cp.filezoo.com.cn']\n" ] } ], @@ -468,7 +676,7 @@ }, { "cell_type": "code", - "execution_count": 25, + "execution_count": null, "metadata": {}, "outputs": [ { @@ -478,7 +686,7 @@ "traceback": [ "\u001b[0;31m---------------------------------------------------------------------------\u001b[0m", "\u001b[0;31mIndexError\u001b[0m Traceback (most recent call last)", - "Cell \u001b[0;32mIn[25], line 8\u001b[0m\n\u001b[1;32m 2\u001b[0m \u001b[38;5;28;01mimport\u001b[39;00m \u001b[38;5;21;01mjson\u001b[39;00m\n\u001b[1;32m 4\u001b[0m headers \u001b[38;5;241m=\u001b[39m {\n\u001b[1;32m 5\u001b[0m \u001b[38;5;124m'\u001b[39m\u001b[38;5;124mContent-Type\u001b[39m\u001b[38;5;124m'\u001b[39m: \u001b[38;5;124m'\u001b[39m\u001b[38;5;124mapplication/json\u001b[39m\u001b[38;5;124m'\u001b[39m,\n\u001b[1;32m 6\u001b[0m 
}\n\u001b[0;32m----> 8\u001b[0m response \u001b[38;5;241m=\u001b[39m requests\u001b[38;5;241m.\u001b[39mget(\u001b[43mr\u001b[49m\u001b[43m[\u001b[49m\u001b[38;5;241;43m0\u001b[39;49m\u001b[43m]\u001b[49m, headers\u001b[38;5;241m=\u001b[39mheaders)\n\u001b[1;32m 10\u001b[0m \u001b[38;5;28;01mtry\u001b[39;00m:\n\u001b[1;32m 11\u001b[0m \u001b[38;5;28mprint\u001b[39m(json\u001b[38;5;241m.\u001b[39mdumps(response\u001b[38;5;241m.\u001b[39mjson(), indent\u001b[38;5;241m=\u001b[39m\u001b[38;5;241m4\u001b[39m))\n", + "Cell \u001b[0;32mIn[52], line 8\u001b[0m\n\u001b[1;32m 2\u001b[0m \u001b[38;5;28;01mimport\u001b[39;00m \u001b[38;5;21;01mjson\u001b[39;00m\n\u001b[1;32m 4\u001b[0m headers \u001b[38;5;241m=\u001b[39m {\n\u001b[1;32m 5\u001b[0m \u001b[38;5;124m'\u001b[39m\u001b[38;5;124mContent-Type\u001b[39m\u001b[38;5;124m'\u001b[39m: \u001b[38;5;124m'\u001b[39m\u001b[38;5;124mapplication/json\u001b[39m\u001b[38;5;124m'\u001b[39m,\n\u001b[1;32m 6\u001b[0m }\n\u001b[0;32m----> 8\u001b[0m response \u001b[38;5;241m=\u001b[39m requests\u001b[38;5;241m.\u001b[39mget(\u001b[43mr\u001b[49m\u001b[43m[\u001b[49m\u001b[38;5;241;43m0\u001b[39;49m\u001b[43m]\u001b[49m, headers\u001b[38;5;241m=\u001b[39mheaders)\n\u001b[1;32m 10\u001b[0m \u001b[38;5;28;01mtry\u001b[39;00m:\n\u001b[1;32m 11\u001b[0m \u001b[38;5;28mprint\u001b[39m(json\u001b[38;5;241m.\u001b[39mdumps(response\u001b[38;5;241m.\u001b[39mjson(), indent\u001b[38;5;241m=\u001b[39m\u001b[38;5;241m4\u001b[39m))\n", "\u001b[0;31mIndexError\u001b[0m: list index out of range" ] } diff --git a/examples/example-deploy-v1.ipynb b/examples/example-deploy-v0.0.1-deprecated.ipynb similarity index 69% rename from examples/example-deploy-v1.ipynb rename to examples/example-deploy-v0.0.1-deprecated.ipynb index 6fe2c4c3..7cfcdcd7 100644 --- a/examples/example-deploy-v1.ipynb +++ b/examples/example-deploy-v0.0.1-deprecated.ipynb @@ -6,7 +6,7 @@ "source": [ "## Example for SDK\n", "\n", - "This example shows how to use SDK to deploy a task (V1)\n", + "This example shows how to use SDK to deploy a task (deprecated version)\n", "\n", "### Initialization" ] @@ -33,8 +33,8 @@ "# get user api_key in dashboard page: https://orchestrator-test.swanchain.io/provider-status\n", "swan_api = SwanAPI(api_key=os.getenv(\"API_KEY\"), environment=dev_url)\n", "\n", - "api_key = os.getenv(\"MCS_API_KEY\")\n", - "mcs_api = MCSAPI(api_key)" + "# api_key = os.getenv(\"MCS_API_KEY\")\n", + "# mcs_api = MCSAPI(api_key)" ] }, { @@ -72,7 +72,7 @@ "output_type": "stream", "text": [ "1\n", - "('C1ae.medium', 1, ['-', 'North Carolina-US'])\n" + "('C1ae.medium', 1, ['Quebec-CA', 'North Carolina-US'])\n" ] } ], @@ -142,8 +142,8 @@ "name": "stdout", "output_type": "stream", "text": [ - "0xf2bb683e0b3f933bf4d1582a3ca4ea158904dab2c26c7d6d78cc6aded2cefd04\n", - "0xc391e15ec0ac1f80078b965778b6b87cb8a2582be1805a98553142e6b24e92dd\n" + "0x800bbde44740faa1d44b1cf9c3b1a3b629b43742c427fdd743a0ec8e571ad703\n", + "0x0682f87926cd8db6ce3465772bb36f3ae09696443d9a065f5403e0c3e2fe7ce1\n" ] } ], @@ -173,29 +173,16 @@ "name": "stderr", "output_type": "stream", "text": [ - "ERROR:root:SwanAPIRequestException: No C1ae.medium machine in Quebec-CA.Traceback (most recent call last):\n", - " File \"/Users/zihangchen/Documents/work_repos/orchestrator-sdk/examples/../swan/api/swan_api.py\", line 151, in deploy_task\n", - " raise SwanAPIException(f\"No {cfg_name} machine in {region}.\")\n", - "swan.common.exception.SwanAPIException: SwanAPIRequestException: No C1ae.medium machine in Quebec-CA.\n", - "\n" + 
"/var/folders/f1/12wnv55x7vz6rr1ts9l6lytm0000gn/T/ipykernel_38565/44368673.py:2: DeprecationWarning: Call to deprecated method deploy_task. (This API will be removed in the future version.) -- Deprecated since version 0.0.2.\n", + " result = swan_api.deploy_task(\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ - "None\n" - ] - }, - { - "ename": "TypeError", - "evalue": "'NoneType' object is not subscriptable", - "output_type": "error", - "traceback": [ - "\u001b[0;31m---------------------------------------------------------------------------\u001b[0m", - "\u001b[0;31mTypeError\u001b[0m Traceback (most recent call last)", - "Cell \u001b[0;32mIn[7], line 12\u001b[0m\n\u001b[1;32m 2\u001b[0m result \u001b[38;5;241m=\u001b[39m swan_api\u001b[38;5;241m.\u001b[39mdeploy_task(\n\u001b[1;32m 3\u001b[0m cfg_name\u001b[38;5;241m=\u001b[39mdevice, \n\u001b[1;32m 4\u001b[0m region\u001b[38;5;241m=\u001b[39m\u001b[38;5;124m'\u001b[39m\u001b[38;5;124mQuebec-CA\u001b[39m\u001b[38;5;124m'\u001b[39m, \n\u001b[0;32m (...)\u001b[0m\n\u001b[1;32m 9\u001b[0m wallet_address\u001b[38;5;241m=\u001b[39mos\u001b[38;5;241m.\u001b[39mgetenv(\u001b[38;5;124m'\u001b[39m\u001b[38;5;124mWALLET\u001b[39m\u001b[38;5;124m'\u001b[39m),\n\u001b[1;32m 10\u001b[0m )\n\u001b[1;32m 11\u001b[0m \u001b[38;5;28mprint\u001b[39m(result)\n\u001b[0;32m---> 12\u001b[0m task_uuid \u001b[38;5;241m=\u001b[39m \u001b[43mresult\u001b[49m\u001b[43m[\u001b[49m\u001b[38;5;124;43m'\u001b[39;49m\u001b[38;5;124;43mdata\u001b[39;49m\u001b[38;5;124;43m'\u001b[39;49m\u001b[43m]\u001b[49m[\u001b[38;5;124m'\u001b[39m\u001b[38;5;124mtask\u001b[39m\u001b[38;5;124m'\u001b[39m][\u001b[38;5;124m'\u001b[39m\u001b[38;5;124muuid\u001b[39m\u001b[38;5;124m'\u001b[39m]\n\u001b[1;32m 13\u001b[0m \u001b[38;5;28mprint\u001b[39m(\u001b[38;5;124m\"\u001b[39m\u001b[38;5;124mTask UUID:\u001b[39m\u001b[38;5;124m\"\u001b[39m, task_uuid)\n", - "\u001b[0;31mTypeError\u001b[0m: 'NoneType' object is not subscriptable" + "{'data': {'task': {'created_at': '1714764579', 'end_at': '1714768172', 'leading_job_id': None, 'refund_amount': None, 'status': 'created', 'task_detail_cid': 'https://plutotest.acl.swanipfs.com/ipfs/QmcSoSW6MvSyNQesEMTJLSPWEpngCVwyXfbnN12bmbtN2p', 'tx_hash': None, 'updated_at': '1714764579', 'uuid': '226ea8b5-8d91-4470-8002-68fbd89232a2'}}, 'message': 'Task_uuid created.', 'status': 'success'}\n", + "Task UUID: 226ea8b5-8d91-4470-8002-68fbd89232a2\n" ] } ], @@ -226,20 +213,7 @@ "cell_type": "code", "execution_count": null, "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "buildImage\n", - "Job Result URL: https://42f6d9f62851.acl.swanipfs.com/ipfs/QmQEW9Dyi9aEQYtVdMxA7qkr7gdiiNE9MJxNtjt9uSyUKc\n", - "Job Real URL: https://cf24tc2oot.dev2.crosschain.computer\n", - "deployToK8s\n", - "Job Result URL: https://42f6d9f62851.acl.swanipfs.com/ipfs/QmQEW9Dyi9aEQYtVdMxA7qkr7gdiiNE9MJxNtjt9uSyUKc\n", - "Job Real URL: https://cf24tc2oot.dev2.crosschain.computer\n" - ] - } - ], + "outputs": [], "source": [ "# Check task info\n", "while True:\n", diff --git a/examples/example-deploy-v2.ipynb b/examples/example-deploy-v2.ipynb deleted file mode 100644 index 786d67e8..00000000 --- a/examples/example-deploy-v2.ipynb +++ /dev/null @@ -1,560 +0,0 @@ -{ - "cells": [ - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "## Example for SDK\n", - "\n", - "This example shows how to use SDK to deploy a task (V2)\n", - "\n", - "### Initialization\n", - "\n", - "For test version, get a user `API_KEY` in dashboard 
page: https://orchestrator-test.swanchain.io/provider-status" - ] - }, - { - "cell_type": "code", - "execution_count": 1, - "metadata": {}, - "outputs": [], - "source": [ - "import os\n", - "import time\n", - "import dotenv\n", - "dotenv.load_dotenv(\"../.env\")\n", - "from swan import SwanAPI , MCSAPI\n", - "\n", - "# Initialize the Swan Service\n", - "swan_api = SwanAPI(api_key=os.getenv(\"API_KEY\"), environment=\"https://swanhub-cali.swanchain.io\")\n", - "\n", - "api_key = os.getenv(\"MCS_API_KEY\")\n", - "mcs_api = MCSAPI(api_key)" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "### Available hardware information" - ] - }, - { - "cell_type": "code", - "execution_count": 2, - "metadata": {}, - "outputs": [], - "source": [ - "hardwares = swan_api.get_hardware_config()\n", - "hardwares_info = [hardware.to_dict() for hardware in hardwares if hardware.status == \"available\"] \n", - "# hardwares_info" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "choose hardware config" - ] - }, - { - "cell_type": "code", - "execution_count": 3, - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "1\n", - "('C1ae.medium', 1, ['North Carolina-US', 'Quebec-CA'])\n" - ] - } - ], - "source": [ - "device = 'C1ae.medium' #\"G1ae.medium\"\n", - "obj = [hardware for hardware in hardwares if hardware.name == device][0]\n", - "print(obj.id)\n", - "print([(hardware.name, hardware.id, hardware.region) for hardware in hardwares if hardware.name == device][0])" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "to simplify the process, here we use a existing `job_source_uri` which is a hello world application, used to create task." - ] - }, - { - "cell_type": "code", - "execution_count": 4, - "metadata": {}, - "outputs": [], - "source": [ - "job_source_uri = 'https://test-api.lagrangedao.org/spaces/5117e998-c623-4837-8af9-2b7b0ce2de7f'" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "### Define task deploy v2" - ] - }, - { - "cell_type": "code", - "execution_count": 5, - "metadata": {}, - "outputs": [], - "source": [ - "# Deploy task\n", - "\n", - "# before v2 integrated into SDK, use customed function instead\n", - "import logging\n", - "import traceback\n", - "import json\n", - "from swan.common.constant import *\n", - "from swan.common.exception import SwanAPIException\n", - "\n", - "def deploy_task_v2(\n", - " cfg_name: str, \n", - " region: str, \n", - " start_in: int, \n", - " duration: int, \n", - " job_source_uri: str, \n", - " wallet_address: str, \n", - " paid: float = 0.0\n", - " ):\n", - " \"\"\"Sent deploy task request via orchestrator.\n", - "\n", - " Args:\n", - " cfg_name: name of cp/hardware configuration set.\n", - " region: region of hardware.\n", - " start_in: unix timestamp of starting time.\n", - " duration: duration of service runtime in unix time.\n", - " job_source_uri: source uri for space.\n", - " wallet_address: user wallet address.\n", - " paid: paid amount in Eth.\n", - "\n", - " Returns:\n", - " JSON response from backend server including 'task_uuid'.\n", - " \"\"\"\n", - " try:\n", - " if swan_api._verify_hardware_region(cfg_name, region):\n", - " params = {\n", - " \"paid\": paid,\n", - " \"duration\": duration,\n", - " \"cfg_name\": cfg_name,\n", - " \"region\": region,\n", - " \"start_in\": start_in,\n", - " \"wallet\": wallet_address,\n", - " \"job_source_uri\": job_source_uri\n", - " }\n", - " result = 
swan_api._request_with_params(\n", - " POST, \n", - " '/v2/task_deployment', \n", - " swan_api.swan_url, \n", - " params, \n", - " swan_api.token, \n", - " None\n", - " )\n", - " return result\n", - " else:\n", - " raise SwanAPIException(f\"No {cfg_name} machine in {region}.\")\n", - " except Exception as e:\n", - " logging.error(str(e) + traceback.format_exc())\n", - " return None\n" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "### Define contract util class v2" - ] - }, - { - "cell_type": "code", - "execution_count": 6, - "metadata": {}, - "outputs": [], - "source": [ - "# ./swan/contract/swan_contract_ex.py\n", - "\n", - "\n", - "from swan.common.constant import *\n", - "from swan.common.utils import get_contract_abi\n", - "\n", - "from swan.contract.swan_contract import SwanContract\n", - "\n", - "\n", - "CLIENT_CONTRACT_ADDRESS=\"0xe356a758fA1748dfBE71E989c876959665a66ddA\"\n", - "_CLIENT_CONTRACT_ABI = \"../swan/contract/abi/ClientPayment.json\"\n", - "\n", - "class SwanContractEx(SwanContract):\n", - "\n", - " def __init__(self, private_key: str, rpc_url: str):\n", - " \"\"\" Initialize swan contract API connection.\n", - "\n", - " Args:\n", - " private_key: private key for wallet.\n", - " rpc_url: rpc url of swan chain for connection.\n", - " \"\"\"\n", - "\n", - " # with open(_CLIENT_CONTRACT_ABI, 'r') as abi_file:\n", - " # abi_data = json.load(abi_file)\n", - " # client_abi = json.dumps(abi_data)\n", - " \n", - " \n", - " super().__init__(private_key=private_key, rpc_url=rpc_url)\n", - " self.client_contract = self.w3.eth.contract(\n", - " address=CLIENT_CONTRACT_ADDRESS, \n", - " abi=json.load(open(_CLIENT_CONTRACT_ABI))\n", - " )\n", - "\n", - " def submit_payment(self, task_id: str, hardware_id: int, duration: int):\n", - " nonce = self.w3.eth.get_transaction_count(self.account.address)\n", - " base_fee = self.w3.eth.get_block('latest')['baseFeePerGas']\n", - " max_priority_fee_per_gas = self.w3.to_wei(2, 'gwei')\n", - " max_fee_per_gas = base_fee + max_priority_fee_per_gas\n", - " if max_fee_per_gas < max_priority_fee_per_gas:\n", - " max_fee_per_gas = max_priority_fee_per_gas + base_fee\n", - " tx = self.client_contract.functions.submitPayment(task_id, hardware_id, duration).build_transaction({\n", - " 'from': self.account.address,\n", - " 'nonce': nonce,\n", - " \"maxFeePerGas\": max_fee_per_gas,\n", - " \"maxPriorityFeePerGas\": max_priority_fee_per_gas,\n", - " })\n", - " signed_tx = self.w3.eth.account.sign_transaction(tx, self.account._private_key)\n", - " tx_hash = self.w3.eth.send_raw_transaction(signed_tx.rawTransaction)\n", - " self.w3.eth.wait_for_transaction_receipt(tx_hash, timeout=CONTRACT_TIMEOUT)\n", - " return self.w3.to_hex(tx_hash)\n", - " \n", - " def _approve_swan_token(self, amount):\n", - " nonce = self.w3.eth.get_transaction_count(self.account.address)\n", - " base_fee = self.w3.eth.get_block('latest')['baseFeePerGas']\n", - " max_priority_fee_per_gas = self.w3.to_wei(2, 'gwei')\n", - " max_fee_per_gas = base_fee + max_priority_fee_per_gas\n", - " if max_fee_per_gas < max_priority_fee_per_gas:\n", - " max_fee_per_gas = max_priority_fee_per_gas + base_fee\n", - " tx = self.token_contract.functions.approve(self.client_contract.address, amount).build_transaction({\n", - " 'from': self.account.address,\n", - " 'nonce': nonce,\n", - " \"maxFeePerGas\": max_fee_per_gas,\n", - " \"maxPriorityFeePerGas\": max_priority_fee_per_gas,\n", - " })\n", - " signed_tx = self.w3.eth.account.sign_transaction(tx, 
self.account._private_key)\n", - " tx_hash = self.w3.eth.send_raw_transaction(signed_tx.rawTransaction)\n", - " self.w3.eth.wait_for_transaction_receipt(tx_hash, timeout=CONTRACT_TIMEOUT)\n", - " return self.w3.to_hex(tx_hash)\n", - " " - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "### Contract and Payment Estimation\n", - "\n", - "- Firstly, use contract function to estimate the amount to pay. \n", - "- Secondly, by `task_uuid` gotten from v2 API `/v2/task_deployment`, **pay** with `task_uuid` and `hardware_id`\n", - "- Thirdly, do payment validation via v2 API `/v2/task_payment_validate`, which will enable task eligible for assigning" - ] - }, - { - "cell_type": "code", - "execution_count": 7, - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "1000000000000000000\n" - ] - } - ], - "source": [ - "pk = os.getenv('PK')\n", - "rpc = os.getenv('RPC')\n", - "\n", - "c2 = SwanContractEx(pk, rpc)\n", - "duration_hour = 1 # hour\n", - "amount = c2.estimate_payment(obj.id, duration_hour)\n", - "print(amount)" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "### Deploy task\n", - "\n", - "This step shows how to use SDK's interface for deploying task, which calls Orchestrator's task deployment API (V2), to get `task_uuid`, which will be used in payment." - ] - }, - { - "cell_type": "code", - "execution_count": 8, - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "{'data': {'task': {'created_at': '1713377715', 'end_at': '1713381312', 'leading_job_id': None, 'refund_amount': None, 'status': 'initialized', 'task_detail_cid': 'https://plutotest.acl.swanipfs.com/ipfs/QmUvHQNGETErt6MP7PMx6525ZgKKoe19zyt6oBkVR9kvzU', 'tx_hash': None, 'updated_at': '1713377715', 'uuid': '0242bdf6-45bb-4743-b081-f86d4f702bc0'}}, 'message': 'Task_uuid initialized.', 'status': 'success'}\n", - "Task UUID: 0242bdf6-45bb-4743-b081-f86d4f702bc0\n" - ] - } - ], - "source": [ - "duration=3600*duration_hour\n", - "\n", - "result = deploy_task_v2(\n", - " cfg_name=device, \n", - " region='Quebec-CA', \n", - " start_in=5, \n", - " duration=duration, \n", - " job_source_uri=job_source_uri, \n", - " paid=c2._wei_to_swan(amount),\n", - " wallet_address=os.getenv('WALLET'),\n", - " )\n", - "print(result)\n", - "task_uuid = result['data']['task']['uuid']\n", - "print(\"Task UUID:\", task_uuid)" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "### Submit Payment\n", - "\n", - "This step is using `task_uuid`, `hardware_id` and `duration` to submit payment via **ClientPayment** contract." 
- ] - }, - { - "cell_type": "code", - "execution_count": 9, - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "0x30b0bce0f724fc7fc56e89c2e79c520cc68b5376246212d64b8427b25de6138e\n", - "0x4fcabb68f20381cc8f10fe5873e7c544ec0500fb4b84542230e21db803fd1336\n" - ] - } - ], - "source": [ - "r = c2._approve_swan_token(amount)\n", - "print(r)\n", - " \n", - "tx_hash = c2.submit_payment(task_uuid, obj.id, duration)\n", - "print(tx_hash)" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "### Validate Payment via API\n", - "\n", - "This step will validate the payment and then make task eligible for assigning if validation successful" - ] - }, - { - "cell_type": "code", - "execution_count": 31, - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "0x70d958a1bb0d43b88215674789dbdf142b9d3bb512c1a5345dd8e09dcb6c09ce\n", - "b35d5ce8-2dcb-4e67-88de-f48d89487418\n", - "{'tx_hash': '0x70d958a1bb0d43b88215674789dbdf142b9d3bb512c1a5345dd8e09dcb6c09ce', 'task_uuid': 'b35d5ce8-2dcb-4e67-88de-f48d89487418'}\n", - "{'data': {'error_code': 1131}, 'message': 'payment validation failed: payment receipt contract address is not correct', 'status': 'failed'}\n" - ] - } - ], - "source": [ - "\n", - "def validate_payment(\n", - " tx_hash,\n", - " task_uuid\n", - " ):\n", - " \n", - " print(tx_hash)\n", - " print(task_uuid)\n", - "\n", - " try:\n", - " if tx_hash and task_uuid:\n", - " params = {\n", - " \"tx_hash\": tx_hash,\n", - " \"task_uuid\": task_uuid\n", - " }\n", - " print(params)\n", - " result = swan_api._request_with_params(\n", - " POST, \n", - " '/v2/task_payment_validate', \n", - " swan_api.swan_url, \n", - " params, \n", - " os.getenv(\"API_KEY\"), #swan_api.token, \n", - " None\n", - " )\n", - " return result\n", - " else:\n", - " raise SwanAPIException(f\"{tx_hash=} or {task_uuid=} invalid\")\n", - " except Exception as e:\n", - " logging.error(str(e) + traceback.format_exc())\n", - " return None\n", - "\n", - "result_validation = validate_payment(\n", - " tx_hash=tx_hash,\n", - " task_uuid=task_uuid\n", - ")\n", - "print(result_validation)" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "The following step is optional, shows information when waiting for task being deployed." 
- ] - }, - { - "cell_type": "code", - "execution_count": 9, - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "buildImage\n", - "Job Result URL: https://42f6d9f62851.acl.swanipfs.com/ipfs/QmQEW9Dyi9aEQYtVdMxA7qkr7gdiiNE9MJxNtjt9uSyUKc\n", - "Job Real URL: https://cf24tc2oot.dev2.crosschain.computer\n", - "deployToK8s\n", - "Job Result URL: https://42f6d9f62851.acl.swanipfs.com/ipfs/QmQEW9Dyi9aEQYtVdMxA7qkr7gdiiNE9MJxNtjt9uSyUKc\n", - "Job Real URL: https://cf24tc2oot.dev2.crosschain.computer\n" - ] - } - ], - "source": [ - "# Check task info\n", - "while True:\n", - " info = swan_api.get_deployment_info(task_uuid=task_uuid)\n", - " if len(info['data']['jobs']) > 0:\n", - " \n", - " status = info['data']['jobs'][0]['status']\n", - " print(status)\n", - " \n", - " job_res_uri = info['data']['jobs'][0]['job_result_uri']\n", - " job_real_uri = info['data']['jobs'][0]['job_real_uri']\n", - " print(\"Job Result URL: \", job_res_uri)\n", - " print(\"Job Real URL: \", job_real_uri)\n", - " \n", - " # break\n", - " if status == 'deployToK8s' or status == \"Cancelled\" or status == \"Failed\":\n", - " break\n", - " \n", - " time.sleep(30)" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "### Show result\n", - "\n", - "`job_real_uri` is for show the result of application you deployed." - ] - }, - { - "cell_type": "code", - "execution_count": 10, - "metadata": {}, - "outputs": [ - { - "name": "stderr", - "output_type": "stream", - "text": [ - "ERROR:root:'NoneType' object is not subscriptableTraceback (most recent call last):\n", - " File \"/Users/aaronli/miniconda3/envs/test-sdk2/lib/python3.10/site-packages/swan/api/swan_api.py\", line 184, in get_real_url\n", - " jobs = deployment_info['data']['jobs']\n", - "TypeError: 'NoneType' object is not subscriptable\n", - "\n" - ] - }, - { - "name": "stdout", - "output_type": "stream", - "text": [ - "None\n" - ] - } - ], - "source": [ - "r = swan_api.get_real_url(task_uuid)\n", - "print(r)" - ] - }, - { - "cell_type": "code", - "execution_count": 12, - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "{\n", - " \"Hello\": \"World! 
Today - 06.21\"\n", - "}\n" - ] - } - ], - "source": [ - "import requests\n", - "import json\n", - "\n", - "headers = {\n", - " 'Content-Type': 'application/json',\n", - "}\n", - "\n", - "response = requests.get(r[0], headers=headers)\n", - "\n", - "try:\n", - " print(json.dumps(response.json(), indent=4))\n", - "except Exception as e:\n", - " print(e)\n", - " print(response)\n" - ] - } - ], - "metadata": { - "kernelspec": { - "display_name": "swanchain", - "language": "python", - "name": "python3" - }, - "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 3 - }, - "file_extension": ".py", - "mimetype": "text/x-python", - "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython3", - "version": "3.10.14" - } - }, - "nbformat": 4, - "nbformat_minor": 2 -} diff --git a/swan/api_client.py b/swan/api_client.py index 34370e7f..1662a6ba 100644 --- a/swan/api_client.py +++ b/swan/api_client.py @@ -46,8 +46,8 @@ def _request(self, method, request_path, swan_api, params, token, files=False, j return response.json() - def _request_stream_upload(self, request_path, swwan_api, params, token): - url = swwan_api + request_path + def _request_stream_upload(self, request_path, swan_api, params, token): + url = swan_api + request_path header = {} if token: header["Authorization"] = "Bearer " + token @@ -78,8 +78,8 @@ def _request_stream_upload(self, request_path, swwan_api, params, token): return response.json() - def _request_bucket_upload(self, request_path, swwan_api, params, token): - url = swwan_api + request_path + def _request_bucket_upload(self, request_path, swan_api, params, token): + url = swan_api + request_path header = {} if token: header["Authorization"] = "Bearer " + token From e56bb61852e0fa8516c4292724f0240f5e4fb549 Mon Sep 17 00:00:00 2001 From: alphaflows Date: Fri, 3 May 2024 16:09:22 -0400 Subject: [PATCH 04/10] update demo --- examples/example-demo-v0.0.2.ipynb | 94 +++++++++++++++++------------- 1 file changed, 54 insertions(+), 40 deletions(-) diff --git a/examples/example-demo-v0.0.2.ipynb b/examples/example-demo-v0.0.2.ipynb index 21b0af53..f88be9b5 100644 --- a/examples/example-demo-v0.0.2.ipynb +++ b/examples/example-demo-v0.0.2.ipynb @@ -4,7 +4,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "## Demo for SDK (latest)\n", + "## Demo for SDK (v0.0.2)\n", "\n", "This example shows how to use SDK to deploy a task. 
The demo notebook includes the following steps:\n", "- [initialization](#initialization)\n", @@ -33,7 +33,7 @@ "\n", "If use this repository to test on your local machine, add `sys.path.insert(0, '..')` at the beginning, and run code in the root directory of this repository.\n", "\n", - "You need to add environment file `.env` in your local directory, including the following parameters (`PK` is private key):\n", + "To use this SDK, you need to add environment file `.env` in your local directory, including the following parameters (`PK` is private key):\n", "\n", "```\n", "API_KEY=\n", @@ -44,7 +44,7 @@ }, { "cell_type": "code", - "execution_count": 5, + "execution_count": 1, "metadata": {}, "outputs": [], "source": [ @@ -74,7 +74,7 @@ }, { "cell_type": "code", - "execution_count": 6, + "execution_count": 2, "metadata": {}, "outputs": [ { @@ -104,7 +104,7 @@ }, { "cell_type": "code", - "execution_count": 7, + "execution_count": 3, "metadata": {}, "outputs": [ { @@ -222,14 +222,14 @@ " 'name': 'Hpc1ae.2xlarge',\n", " 'description': 'NVIDIA A4000 · 4 vCPU · 8 GiB',\n", " 'type': 'AI GPU',\n", - " 'reigion': ['North Carolina-US', 'North Rhine-Westphalia-DE'],\n", + " 'reigion': ['North Carolina-US'],\n", " 'price': '21.0',\n", " 'status': 'available'},\n", " {'id': 25,\n", " 'name': 'Hpc1ae.3xlarge',\n", " 'description': 'NVIDIA A4000 · 8 vCPU · 16 GiB',\n", " 'type': 'AI GPU',\n", - " 'reigion': ['North Carolina-US', 'North Rhine-Westphalia-DE'],\n", + " 'reigion': ['North Carolina-US'],\n", " 'price': '21.0',\n", " 'status': 'available'},\n", " {'id': 27,\n", @@ -281,6 +281,20 @@ " 'reigion': ['North Carolina-US'],\n", " 'price': '75.0',\n", " 'status': 'available'},\n", + " {'id': 63,\n", + " 'name': 'Hpc2ad.medium',\n", + " 'description': 'Nvidia 4070 · 4 vCPU · 8 GiB',\n", + " 'type': 'AI GPU',\n", + " 'reigion': ['North Rhine-Westphalia-DE'],\n", + " 'price': '18.0',\n", + " 'status': 'available'},\n", + " {'id': 68,\n", + " 'name': 'Hpc2az.medium',\n", + " 'description': 'Nvidia 4070 · 8 vCPU · 16 GiB',\n", + " 'type': 'AI GPU',\n", + " 'reigion': ['North Rhine-Westphalia-DE'],\n", + " 'price': '23.0',\n", + " 'status': 'available'},\n", " {'id': 72,\n", " 'name': 'R1ae.small',\n", " 'description': 'Nvidia 2080 TI · 8 vCPU · 32 GiB',\n", @@ -327,7 +341,7 @@ " 'status': 'available'}]" ] }, - "execution_count": 7, + "execution_count": 3, "metadata": {}, "output_type": "execute_result" } @@ -351,7 +365,7 @@ }, { "cell_type": "code", - "execution_count": 8, + "execution_count": 4, "metadata": {}, "outputs": [ { @@ -374,7 +388,7 @@ }, { "cell_type": "code", - "execution_count": 9, + "execution_count": 5, "metadata": {}, "outputs": [ { @@ -383,7 +397,7 @@ "1" ] }, - "execution_count": 9, + "execution_count": 5, "metadata": {}, "output_type": "execute_result" } @@ -403,7 +417,7 @@ }, { "cell_type": "code", - "execution_count": 10, + "execution_count": 6, "metadata": {}, "outputs": [], "source": [ @@ -417,7 +431,7 @@ }, { "cell_type": "code", - "execution_count": 11, + "execution_count": 7, "metadata": {}, "outputs": [], "source": [ @@ -427,16 +441,16 @@ }, { "cell_type": "code", - "execution_count": 12, + "execution_count": 8, "metadata": {}, "outputs": [ { "data": { "text/plain": [ - "'https://data.mcs.lagrangedao.org/ipfs/QmRGyt2gSvUkM8UePz5aeTW2tmw2MQrS4oXPVWLwHufqyN'" + "'https://data.mcs.lagrangedao.org/ipfs/QmNbszwFUG7ZCsi3NjdEDME8WzX6LL3GHVmMf8KDmyUrYG'" ] }, - "execution_count": 12, + "execution_count": 8, "metadata": {}, "output_type": "execute_result" } @@ -454,7 +468,7 @@ }, { 
"cell_type": "code", - "execution_count": 13, + "execution_count": 9, "metadata": {}, "outputs": [ { @@ -482,7 +496,7 @@ }, { "cell_type": "code", - "execution_count": 14, + "execution_count": 10, "metadata": {}, "outputs": [ { @@ -492,15 +506,15 @@ "{\n", " \"data\": {\n", " \"task\": {\n", - " \"created_at\": \"1714763824\",\n", - " \"end_at\": \"1714767416\",\n", + " \"created_at\": \"1714766359\",\n", + " \"end_at\": \"1714769950\",\n", " \"leading_job_id\": null,\n", " \"refund_amount\": null,\n", " \"status\": \"initialized\",\n", - " \"task_detail_cid\": \"https://data.mcs.lagrangedao.org/ipfs/QmWkquL27Tss4GWGs5nzPnijV1gZmF5FSxA8znTQmvmn3B\",\n", + " \"task_detail_cid\": \"https://data.mcs.lagrangedao.org/ipfs/QmVkif9jcuPXnoFjcVx1V5AKAVBVdA1VDMUMRkKbBEPR5D\",\n", " \"tx_hash\": null,\n", - " \"updated_at\": \"1714763824\",\n", - " \"uuid\": \"8d19975e-cbfc-4423-a580-2bd1ba7aac1e\"\n", + " \"updated_at\": \"1714766359\",\n", + " \"uuid\": \"d2ee1a94-f96f-4fb3-8a64-65d26314e000\"\n", " }\n", " },\n", " \"message\": \"Task_uuid initialized.\",\n", @@ -517,7 +531,7 @@ " region='North Carolina-US', \n", " start_in=300, \n", " duration=duration, \n", - " job_source_uri=job_source_uri,#repo.source_uri, \n", + " job_source_uri=job_source_uri,\n", " paid=contract._wei_to_swan(amount),\n", " wallet_address=os.getenv('WALLET'),\n", ")\n", @@ -536,14 +550,14 @@ }, { "cell_type": "code", - "execution_count": 15, + "execution_count": 11, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ - "0x27205cc0eeb4a2283af518c9aa5e05f436b829b0dc51ec40d08652168651b9d7\n" + "0x68b940951a5e9d7ccb8674b4342fab0de942a3aa64d5395d74f874f48cb7a58d\n" ] } ], @@ -563,26 +577,26 @@ }, { "cell_type": "code", - "execution_count": 16, + "execution_count": 12, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ - "{'tx_hash': '0x27205cc0eeb4a2283af518c9aa5e05f436b829b0dc51ec40d08652168651b9d7', 'task_uuid': '8d19975e-cbfc-4423-a580-2bd1ba7aac1e'}\n", + "{'tx_hash': '0x68b940951a5e9d7ccb8674b4342fab0de942a3aa64d5395d74f874f48cb7a58d', 'task_uuid': 'd2ee1a94-f96f-4fb3-8a64-65d26314e000'}\n", "{\n", " \"data\": {\n", " \"task\": {\n", - " \"created_at\": \"1714763824\",\n", - " \"end_at\": \"1714767416\",\n", + " \"created_at\": \"1714766359\",\n", + " \"end_at\": \"1714769950\",\n", " \"leading_job_id\": null,\n", " \"refund_amount\": null,\n", " \"status\": \"created\",\n", - " \"task_detail_cid\": \"https://data.mcs.lagrangedao.org/ipfs/QmWkquL27Tss4GWGs5nzPnijV1gZmF5FSxA8znTQmvmn3B\",\n", + " \"task_detail_cid\": \"https://data.mcs.lagrangedao.org/ipfs/QmVkif9jcuPXnoFjcVx1V5AKAVBVdA1VDMUMRkKbBEPR5D\",\n", " \"tx_hash\": null,\n", - " \"updated_at\": \"1714763841\",\n", - " \"uuid\": \"8d19975e-cbfc-4423-a580-2bd1ba7aac1e\"\n", + " \"updated_at\": \"1714766390\",\n", + " \"uuid\": \"d2ee1a94-f96f-4fb3-8a64-65d26314e000\"\n", " }\n", " },\n", " \"message\": \"Task payment validated successfully.\",\n", @@ -611,7 +625,7 @@ }, { "cell_type": "code", - "execution_count": 17, + "execution_count": 14, "metadata": {}, "outputs": [ { @@ -623,18 +637,18 @@ " \"computing_providers\": [],\n", " \"jobs\": [],\n", " \"task\": {\n", - " \"created_at\": \"1714763824\",\n", - " \"end_at\": \"1714767416\",\n", + " \"created_at\": \"1714766359\",\n", + " \"end_at\": \"1714769950\",\n", " \"leading_job_id\": null,\n", " \"refund_amount\": null,\n", " \"status\": \"accepting_bids\",\n", - " \"task_detail_cid\": 
\"https://data.mcs.lagrangedao.org/ipfs/QmWkquL27Tss4GWGs5nzPnijV1gZmF5FSxA8znTQmvmn3B\",\n", + " \"task_detail_cid\": \"https://data.mcs.lagrangedao.org/ipfs/QmVkif9jcuPXnoFjcVx1V5AKAVBVdA1VDMUMRkKbBEPR5D\",\n", " \"tx_hash\": null,\n", - " \"updated_at\": \"1714763861\",\n", - " \"uuid\": \"8d19975e-cbfc-4423-a580-2bd1ba7aac1e\"\n", + " \"updated_at\": \"1714766411\",\n", + " \"uuid\": \"d2ee1a94-f96f-4fb3-8a64-65d26314e000\"\n", " }\n", " },\n", - " \"message\": \"fetch task info for task_uuid='8d19975e-cbfc-4423-a580-2bd1ba7aac1e' successfully\",\n", + " \"message\": \"fetch task info for task_uuid='d2ee1a94-f96f-4fb3-8a64-65d26314e000' successfully\",\n", " \"status\": \"success\"\n", "}\n" ] From 9971bc863f77e28fe224bde5520d775dd782f89b Mon Sep 17 00:00:00 2001 From: alphaflows Date: Fri, 3 May 2024 16:21:38 -0400 Subject: [PATCH 05/10] update demo --- examples/example-demo-v0.0.2.ipynb | 94 +++++++++++++----------------- 1 file changed, 41 insertions(+), 53 deletions(-) diff --git a/examples/example-demo-v0.0.2.ipynb b/examples/example-demo-v0.0.2.ipynb index f88be9b5..7d38566b 100644 --- a/examples/example-demo-v0.0.2.ipynb +++ b/examples/example-demo-v0.0.2.ipynb @@ -44,7 +44,7 @@ }, { "cell_type": "code", - "execution_count": 1, + "execution_count": 15, "metadata": {}, "outputs": [], "source": [ @@ -74,7 +74,7 @@ }, { "cell_type": "code", - "execution_count": 2, + "execution_count": 16, "metadata": {}, "outputs": [ { @@ -104,7 +104,7 @@ }, { "cell_type": "code", - "execution_count": 3, + "execution_count": 17, "metadata": {}, "outputs": [ { @@ -341,7 +341,7 @@ " 'status': 'available'}]" ] }, - "execution_count": 3, + "execution_count": 17, "metadata": {}, "output_type": "execute_result" } @@ -365,20 +365,20 @@ }, { "cell_type": "code", - "execution_count": 4, + "execution_count": 18, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ - "hardware.name='C1ae.medium', hardware.id=1, ['North Carolina-US', 'Bashkortostan Republic-RU', 'Kyiv City-UA', 'Kowloon City-HK', 'Tokyo-JP', 'California-US', 'Central and Western District-HK', 'Quebec-CA', 'North West-SG', 'Kwai Tsing-HK', 'Bavaria-DE', 'Guangdong-CN', 'Kowloon-HK', 'North Rhine-Westphalia-DE']\n", - "The chosen hardware_id=1\n" + "hardware.name='C1ae.small', hardware.id=0, ['North Carolina-US', 'Bashkortostan Republic-RU', 'Kyiv City-UA', 'Kowloon City-HK', 'Tokyo-JP', 'California-US', 'Central and Western District-HK', 'Quebec-CA', 'North West-SG', 'Kwai Tsing-HK', 'Bavaria-DE', 'Saxony-DE', 'Guangdong-CN', 'Kowloon-HK', 'North Rhine-Westphalia-DE']\n", + "The chosen hardware_id=0\n" ] } ], "source": [ - "cfg_name = 'C1ae.medium' #\"G1ae.medium\"\n", + "cfg_name = 'C1ae.small' #\"G1ae.medium\"\n", "hardware = [hardware for hardware in hardwares if hardware.name == cfg_name][0]\n", "print(f\"{hardware.name=}, {hardware.id=}, {hardware.region}\")\n", "\n", @@ -388,16 +388,16 @@ }, { "cell_type": "code", - "execution_count": 5, + "execution_count": 19, "metadata": {}, "outputs": [ { "data": { "text/plain": [ - "1" + "0" ] }, - "execution_count": 5, + "execution_count": 19, "metadata": {}, "output_type": "execute_result" } @@ -417,7 +417,7 @@ }, { "cell_type": "code", - "execution_count": 6, + "execution_count": 20, "metadata": {}, "outputs": [], "source": [ @@ -431,7 +431,7 @@ }, { "cell_type": "code", - "execution_count": 7, + "execution_count": 21, "metadata": {}, "outputs": [], "source": [ @@ -441,16 +441,16 @@ }, { "cell_type": "code", - "execution_count": 8, + "execution_count": 22, 
"metadata": {}, "outputs": [ { "data": { "text/plain": [ - "'https://data.mcs.lagrangedao.org/ipfs/QmNbszwFUG7ZCsi3NjdEDME8WzX6LL3GHVmMf8KDmyUrYG'" + "'https://data.mcs.lagrangedao.org/ipfs/QmWPaTd91qMm9qoAaXahe3r5Q18D8BTeHK27KUZHZwRcUp'" ] }, - "execution_count": 8, + "execution_count": 22, "metadata": {}, "output_type": "execute_result" } @@ -468,14 +468,14 @@ }, { "cell_type": "code", - "execution_count": 9, + "execution_count": 23, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ - "1000000000000000000\n" + "0\n" ] } ], @@ -496,7 +496,7 @@ }, { "cell_type": "code", - "execution_count": 10, + "execution_count": 24, "metadata": {}, "outputs": [ { @@ -506,15 +506,15 @@ "{\n", " \"data\": {\n", " \"task\": {\n", - " \"created_at\": \"1714766359\",\n", - " \"end_at\": \"1714769950\",\n", + " \"created_at\": \"1714767107\",\n", + " \"end_at\": \"1714770698\",\n", " \"leading_job_id\": null,\n", " \"refund_amount\": null,\n", " \"status\": \"initialized\",\n", - " \"task_detail_cid\": \"https://data.mcs.lagrangedao.org/ipfs/QmVkif9jcuPXnoFjcVx1V5AKAVBVdA1VDMUMRkKbBEPR5D\",\n", + " \"task_detail_cid\": \"https://data.mcs.lagrangedao.org/ipfs/QmPexyCMmdB2Yvbh5eLAvC8vho6B9TLttbSbr6xLQGNRoG\",\n", " \"tx_hash\": null,\n", - " \"updated_at\": \"1714766359\",\n", - " \"uuid\": \"d2ee1a94-f96f-4fb3-8a64-65d26314e000\"\n", + " \"updated_at\": \"1714767107\",\n", + " \"uuid\": \"0caaf1b1-06b0-448e-a2c8-7f61fa5754fc\"\n", " }\n", " },\n", " \"message\": \"Task_uuid initialized.\",\n", @@ -550,14 +550,14 @@ }, { "cell_type": "code", - "execution_count": 11, + "execution_count": 25, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ - "0x68b940951a5e9d7ccb8674b4342fab0de942a3aa64d5395d74f874f48cb7a58d\n" + "0xbd4dde0778d4c11e7f68fac8a58b71609a4633c299c657793d7e5e56eb302085\n" ] } ], @@ -577,26 +577,26 @@ }, { "cell_type": "code", - "execution_count": 12, + "execution_count": 26, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ - "{'tx_hash': '0x68b940951a5e9d7ccb8674b4342fab0de942a3aa64d5395d74f874f48cb7a58d', 'task_uuid': 'd2ee1a94-f96f-4fb3-8a64-65d26314e000'}\n", + "{'tx_hash': '0xbd4dde0778d4c11e7f68fac8a58b71609a4633c299c657793d7e5e56eb302085', 'task_uuid': '0caaf1b1-06b0-448e-a2c8-7f61fa5754fc'}\n", "{\n", " \"data\": {\n", " \"task\": {\n", - " \"created_at\": \"1714766359\",\n", - " \"end_at\": \"1714769950\",\n", + " \"created_at\": \"1714767107\",\n", + " \"end_at\": \"1714770698\",\n", " \"leading_job_id\": null,\n", " \"refund_amount\": null,\n", " \"status\": \"created\",\n", - " \"task_detail_cid\": \"https://data.mcs.lagrangedao.org/ipfs/QmVkif9jcuPXnoFjcVx1V5AKAVBVdA1VDMUMRkKbBEPR5D\",\n", + " \"task_detail_cid\": \"https://data.mcs.lagrangedao.org/ipfs/QmPexyCMmdB2Yvbh5eLAvC8vho6B9TLttbSbr6xLQGNRoG\",\n", " \"tx_hash\": null,\n", - " \"updated_at\": \"1714766390\",\n", - " \"uuid\": \"d2ee1a94-f96f-4fb3-8a64-65d26314e000\"\n", + " \"updated_at\": \"1714767127\",\n", + " \"uuid\": \"0caaf1b1-06b0-448e-a2c8-7f61fa5754fc\"\n", " }\n", " },\n", " \"message\": \"Task payment validated successfully.\",\n", @@ -625,7 +625,7 @@ }, { "cell_type": "code", - "execution_count": 14, + "execution_count": 28, "metadata": {}, "outputs": [ { @@ -637,18 +637,18 @@ " \"computing_providers\": [],\n", " \"jobs\": [],\n", " \"task\": {\n", - " \"created_at\": \"1714766359\",\n", - " \"end_at\": \"1714769950\",\n", + " \"created_at\": \"1714767107\",\n", + " \"end_at\": \"1714770698\",\n", " 
\"leading_job_id\": null,\n", " \"refund_amount\": null,\n", " \"status\": \"accepting_bids\",\n", - " \"task_detail_cid\": \"https://data.mcs.lagrangedao.org/ipfs/QmVkif9jcuPXnoFjcVx1V5AKAVBVdA1VDMUMRkKbBEPR5D\",\n", + " \"task_detail_cid\": \"https://data.mcs.lagrangedao.org/ipfs/QmPexyCMmdB2Yvbh5eLAvC8vho6B9TLttbSbr6xLQGNRoG\",\n", " \"tx_hash\": null,\n", - " \"updated_at\": \"1714766411\",\n", - " \"uuid\": \"d2ee1a94-f96f-4fb3-8a64-65d26314e000\"\n", + " \"updated_at\": \"1714767131\",\n", + " \"uuid\": \"0caaf1b1-06b0-448e-a2c8-7f61fa5754fc\"\n", " }\n", " },\n", - " \"message\": \"fetch task info for task_uuid='d2ee1a94-f96f-4fb3-8a64-65d26314e000' successfully\",\n", + " \"message\": \"fetch task info for task_uuid='0caaf1b1-06b0-448e-a2c8-7f61fa5754fc' successfully\",\n", " \"status\": \"success\"\n", "}\n" ] @@ -692,19 +692,7 @@ "cell_type": "code", "execution_count": null, "metadata": {}, - "outputs": [ - { - "ename": "IndexError", - "evalue": "list index out of range", - "output_type": "error", - "traceback": [ - "\u001b[0;31m---------------------------------------------------------------------------\u001b[0m", - "\u001b[0;31mIndexError\u001b[0m Traceback (most recent call last)", - "Cell \u001b[0;32mIn[52], line 8\u001b[0m\n\u001b[1;32m 2\u001b[0m \u001b[38;5;28;01mimport\u001b[39;00m \u001b[38;5;21;01mjson\u001b[39;00m\n\u001b[1;32m 4\u001b[0m headers \u001b[38;5;241m=\u001b[39m {\n\u001b[1;32m 5\u001b[0m \u001b[38;5;124m'\u001b[39m\u001b[38;5;124mContent-Type\u001b[39m\u001b[38;5;124m'\u001b[39m: \u001b[38;5;124m'\u001b[39m\u001b[38;5;124mapplication/json\u001b[39m\u001b[38;5;124m'\u001b[39m,\n\u001b[1;32m 6\u001b[0m }\n\u001b[0;32m----> 8\u001b[0m response \u001b[38;5;241m=\u001b[39m requests\u001b[38;5;241m.\u001b[39mget(\u001b[43mr\u001b[49m\u001b[43m[\u001b[49m\u001b[38;5;241;43m0\u001b[39;49m\u001b[43m]\u001b[49m, headers\u001b[38;5;241m=\u001b[39mheaders)\n\u001b[1;32m 10\u001b[0m \u001b[38;5;28;01mtry\u001b[39;00m:\n\u001b[1;32m 11\u001b[0m \u001b[38;5;28mprint\u001b[39m(json\u001b[38;5;241m.\u001b[39mdumps(response\u001b[38;5;241m.\u001b[39mjson(), indent\u001b[38;5;241m=\u001b[39m\u001b[38;5;241m4\u001b[39m))\n", - "\u001b[0;31mIndexError\u001b[0m: list index out of range" - ] - } - ], + "outputs": [], "source": [ "import requests\n", "import json\n", From ed75e26dd9cdb237359f222ca5f6ef2983d4a1d2 Mon Sep 17 00:00:00 2001 From: Zihang Chen <159826530+ZihangChenNBAI@users.noreply.github.com> Date: Fri, 3 May 2024 16:33:44 -0400 Subject: [PATCH 06/10] updated documentation --- README.md | 17 +++- docs/api_reference.md | 0 docs/configuration.md | 37 +++++++++ docs/installation.md | 27 ++++++ docs/sample_tutorial.md | 180 ++++++++++++++++++++++++++++++++++++++++ docs/usage.md | 3 - 6 files changed, 258 insertions(+), 6 deletions(-) delete mode 100644 docs/api_reference.md create mode 100644 docs/configuration.md create mode 100644 docs/sample_tutorial.md delete mode 100644 docs/usage.md diff --git a/README.md b/README.md index 792b6f25..27ead482 100644 --- a/README.md +++ b/README.md @@ -7,6 +7,7 @@ - [Overview](#overview) - [Features](#features) - [Installation](#installation) +- [Use Python dotenv (Optional)](#use-python-dotenv) - [Quick Guide](#quick-start-guide-sdk-v2) 1. [Get SwanHub API Key](#1-get-swanhub-api-key) 2. [Login to SwanHub](#2-login-into-swanhub-through-sdk) @@ -18,6 +19,7 @@ 8. [Submit Payment](#8-submit-payment) 9. [Validate Payment and Delpoy Task](#9-validate-payment-to-deploy-task) 10. 
[Follow Up Deployed Task Status (Optional)](#10-follow-up-task-status-optional) +- [Executale Example](#examples) - [Documentation](#documentation) - [Contribution](#contributions) - [License](#license) @@ -31,12 +33,14 @@ The PYTHON SWAN SDK is a comprehensive toolkit designed to facilitate seamless i - **API Client Integration**: Streamline your development workflow with our intuitive API client. - **Pre-defined Data Models**: Utilize our structured data models for tasks, directories, and source URIs to enhance your application's reliability and scalability. - **Service Layer Abstractions**: Access complex functionalities through a simplified high-level interface, improving code maintainability. -- **Extensive Documentation**: Access a wealth of information through our comprehensive guides and reference materials located in the `docs/` directory. +- **Extensive Documentation**: Access a wealth of information through our comprehensive guides and reference materials located in the `docs/` directory on Github. ## Installation Setting up the PYTHON SWAN SDK is straightforward. +To use Python Swan SDK, use Python 3.8 or later. Earlier versions are not supported. + **Install via PyPI testnet:** ```bash @@ -49,6 +53,12 @@ pip install swan-sdk git clone https://github.com/swanchain/orchestrator-sdk.git ``` +## Use Python dotenv +It is recommanded to store your important person information in configuration or as environmental variables. Python dotenv allows loading environment variable from `.env` files for easier access and better security. + +python-dotenv package: https://pypi.org/project/python-dotenv/ +Detailed instructions: https://github.com/swanchain/python-swan-sdk/tree/dev/docs/configuration.md + ## Quick Start Guide for Swan SDK Jump into using the SDK with this quick example: @@ -218,11 +228,12 @@ r = swan_api.get_real_url(task_uuid) print(r) ``` -For additional detailed examples, visit the `docs/` directory. +## Examples +For executable examples consult https://github.com/swanchain/python-swan-sdk/tree/dev/examples ## Documentation -For comprehensive documentation, including detailed installation guides, usage examples, and complete API references, please consult the `docs/` directory. +For comprehensive documentation, including detailed installation guides, usage examples, and complete API references, please consult https://github.com/swanchain/python-swan-sdk/tree/dev/docs ## Contributions diff --git a/docs/api_reference.md b/docs/api_reference.md deleted file mode 100644 index e69de29b..00000000 diff --git a/docs/configuration.md b/docs/configuration.md new file mode 100644 index 00000000..5a9656a4 --- /dev/null +++ b/docs/configuration.md @@ -0,0 +1,37 @@ +# Swan SDK Configuration + +## Table Of Contents +- [Introduction](#introduction) +- [Use Python dotenv](#use-python-dotenv) + +## Introduction +Swan SDK requires private information such SwanHub API Key, Wallet Address and Private Key. To safely use Swan SDK avoid putting important informance in code. The recommanded way to use private informaiton is to store as environment variables. + +## Use Python dotenv +python-dotenv allow user to write environment variables into `.env` and loaded when needed. 
+ +To download python-dotenv: +```bash +pip install python-dotenv +``` + +Store personal information into .env file: +```bash +vim .env +``` + +Sample .env file: +``` +API_KEY = "12324" +WALLET_ADR = '0x12324' +PRIVATE_KEY = '23123jkk12jh3k12jk' +``` + +To use dotenv in code: +```python +from dotenv import load_dotenv +load_dotenv() + +import os +api_key = os.getenv('API_KEY') +``` \ No newline at end of file diff --git a/docs/installation.md b/docs/installation.md index e69de29b..ba72323d 100644 --- a/docs/installation.md +++ b/docs/installation.md @@ -0,0 +1,27 @@ +# Install Python Swan SDK + +## Table Of Contents +- [Create Virtual Env](#create-virtual-environment) +- [Install from PyPI](#install-python-swan-sdk) +- [Clone from GitHub](#clone-from-github) + + +## Create Virtual Environment + +MacOS and Linux +```bash +python3 -m venv .venv +source .venv/bin/activate +``` + +## Install via PyPI +```bash +pip install swan-sdk +``` + +## Clone from Github +```bash +git clone https://github.com/swanchain/orchestrator-sdk.git +git checkout dev +``` + diff --git a/docs/sample_tutorial.md b/docs/sample_tutorial.md new file mode 100644 index 00000000..435b3bd8 --- /dev/null +++ b/docs/sample_tutorial.md @@ -0,0 +1,180 @@ +# Sample Tutorial for Swan SDK +Jump into using the SDK with this quick example: + +## Table Of Contents +1. [Get SwanHub API Key](#1-get-swanhub-api-key) +2. [Login to SwanHub](#2-login-into-swanhub-through-sdk) +3. [Use Swan Payment Contract](#3-connect-to-swan-payment-contract) +4. [Retrieve CP Hardware Info](#4-retrieve-avaliable-hardware-informaitons) +5. [Get Job Source URI](#5-get-job_source_uri) +6. [Esitmate Task Payment](#6-esitmate-payment-amount) +7. [Create Task](#7-create-task) +8. [Submit Payment](#8-submit-payment) +9. [Validate Payment and Delpoy Task](#9-validate-payment-to-deploy-task) +10. [Follow Up Deployed Task Status (Optional)](#10-follow-up-task-status-optional) + +### 1. Get SwanHub API Key + +To use `swan-sdk` SwanHub API key is required. +- Go to Swan Dashboard: https://orchestrator.swanchain.io/provider-status +- Login through MetaMask. +- Click the user icon on top right. +- Click 'Show API-Key' -> 'New API Key' +- Store your API Key safely, do not share with others. + +### 2. Login into SwanHub Through SDK + +To use `swan-sdk` you will need to login to SwanHub using API Key. (Wallet login is not supported) + +```python +from swan import SwanAPI + +swan_api = SwanAPI(api_key="") +``` + +### 3. Connect to Swan Payment Contract + +Payment of SwanHub deployment is paid through Swan Payment Contract. To navigate the contract ABIs. First create a `SwanContract()` instance: +```python +from swan.contract.swan_contract import SwanContract + +contract = SwanContract('', swan_api.contract_info) +``` + +### 4. Retrieve Avaliable Hardware Informaitons + +SwanHub provides selection of Computing Providers with different hardwares. +Use `SwanAPI().get_hardware_config()` to retrieve all avaliable hardwares on SwanHub. + +Each hardware is stored in `HardwareConfig()` object. +```python +from swan.object import HardwareConfig +``` + +Hardware config contains an unique hardware ID, hardware name, description, hardware type (CPU/GPU), price per hour, avaliable region and current status. 
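Because each `HardwareConfig` carries its type, price, region list and status, a specific machine can also be picked programmatically rather than by reading through the full list. A minimal sketch, assuming the attribute names follow the fields described above; the region string is only an illustration:

```python
# Pick the cheapest available GPU machine offered in a preferred region.
# Attribute names (type, status, region, price) follow the HardwareConfig
# fields described above; the region string is only an illustration.
preferred_region = 'North Carolina-US'

gpu_hardwares = [
    hw for hw in swan_api.get_hardware_config()
    if hw.type == 'GPU'
    and hw.status == 'available'
    and preferred_region in hw.region
]

if gpu_hardwares:
    # price is returned as a string (e.g. '0.0'), so convert before comparing
    chosen_hardware = min(gpu_hardwares, key=lambda hw: float(hw.price))
    print(chosen_hardware.to_dict())
```

The same pattern works for CPU-only machines by filtering on `hw.type == 'CPU'`.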
+ +See all avaliable hardware in a python dictionary: +```python + +hardwares = swan_api.get_hardware_config() +hardwares_info = [hardware.to_dict() for hardware in hardwares if hardware.status == "available"] +hardwares_info +``` +`HardwareConfig().status` shows the avalibility of the hardware. +`HardwareConfig().region` is a list of all region this hardware is avaliable in. + +Retrieve the hardware with hardware id 0: +```python +hardwares = swan_api.get_hardware_config() +chosen_hardware = [hardware for hardware in hardwares if hardware.id == 0] +chosen_hardware.to_dict() +``` + +Sample output: +``` +{'id': 0, + 'name': 'C1ae.small', + 'description': 'CPU only · 2 vCPU · 2 GiB', + 'type': 'CPU', + 'reigion': ['North Carolina-US', ...], + 'price': '0.0', + 'status': 'available' +} +``` + +### 5. Get job_source_uri + +`job_source_uri` can be create through `SwanAPI().get_source_uri()` API. + +Generate a source URI +A demo tetris docker image on GitHub as repo_uri: 'https://github.com/alphaflows/tetris-docker-image.git' +```python +job_source_uri = swan_api.get_source_uri( + repo_uri='', + hardware_id=chosen_hardware.id, + wallet_address='' +) + +job_source_uri = job_source_uri['data']['job_source_uri'] +``` + +### 6. Esitmate Payment Amount +To estimate the payment required for the deployment. Use `SwanContract().estiamte_payment()` +```python +duration_hour = 1 # or duration you want the deployment to run +amount = contract.estimate_payment(chosen_hardware.id, duration_hour) +amount # amount is in wei, 18 decimals +``` + +### 7. Create Task + +Before paying for the task. First create a task on SwanHub using desired task attributes. +```python +import json + +duration = 3600*duration_hour +cfg_name = chosen_hardware.name + +result = swan_api.create_task( + cfg_name=cfg_name, + region='', + start_in=300, # in seconds + duration=duration, + job_source_uri=job_source_uri, #repo.source_uri + paid=contract._wei_to_swan(amount), # from wei to swan amount/1e18 + wallet_address='', +) +task_uuid = result['data']['task']['uuid'] + +print(json.dumps(result, indent=2)) # Print response +``` + +Sample output: +``` +{ + "data": { + "task": { + "created_at": "1714254304", + "end_at": "1714257898", + "leading_job_id": null, + "refund_amount": null, + "status": "initialized", + "task_detail_cid": "https://data.mcs.lagrangedao.org/ipfs/QmXLSaBqtoWZWAUoiYxM3EDxh14kkhpUiYkVjZSK3BhfKj", + "tx_hash": null, + "updated_at": "1714254304", + "uuid": "f4799212-4dc2-4c0b-9209-c0ac7bc48442" + } + }, + "message": "Task_uuid initialized.", + "status": "success" +} +``` + +The `task['uuid']` will be used in following operations. + +### 8. Submit Payment + +Use `SwanContract().submit_payment()` to pay for the task. The TX hash is the receipt for the payment. +```python +tx_hash = contract.submit_payment(task_uuid, hardware_id, duration) +``` + +### 9. Validate Payment to Deploy Task + +Use `SwanAPI().validate_payment()` to validate the payment using TX hash and deploy the task. +```python +swan_api.validate_payment( + tx_hash=tx_hash, + task_uuid=task_uuid +) +``` + +### 10. Follow up Task Status (Optional) + +#### Show results + +Get the deploy URI to test your task deployment using `SwanAPI().get_real_uri()`. 
+```python +r = swan_api.get_real_url(task_uuid) +print(r) +``` \ No newline at end of file diff --git a/docs/usage.md b/docs/usage.md deleted file mode 100644 index 5b99a679..00000000 --- a/docs/usage.md +++ /dev/null @@ -1,3 +0,0 @@ -# Using Swan SDK V1 APIs - -## Table Of Contents \ No newline at end of file From 5e0d7cbf4ac226daf90cddbcf51137969aca2301 Mon Sep 17 00:00:00 2001 From: Zihang Chen <159826530+ZihangChenNBAI@users.noreply.github.com> Date: Fri, 3 May 2024 16:44:07 -0400 Subject: [PATCH 07/10] update documentation for release --- README.md | 17 ++++++++--------- 1 file changed, 8 insertions(+), 9 deletions(-) diff --git a/README.md b/README.md index 27ead482..fe5276d9 100644 --- a/README.md +++ b/README.md @@ -28,6 +28,8 @@ The PYTHON SWAN SDK is a comprehensive toolkit designed to facilitate seamless interactions with the SwanChain API. Tailored for developers, this SDK simplifies the creation and management of computational tasks (CP tasks), making it an indispensable tool for developers working in various tech domains. +GitHub Link: https://github.com/swanchain/python-swan-sdk/tree/release/v0.0.2 + ## Features - **API Client Integration**: Streamline your development workflow with our intuitive API client. @@ -44,20 +46,21 @@ To use Python Swan SDK, use Python 3.8 or later. Earlier versions are not suppor **Install via PyPI testnet:** ```bash -pip install swan-sdk +pip install swan-sdk==0.0.2 ``` **Clone from GitHub:** ```bash git clone https://github.com/swanchain/orchestrator-sdk.git +git checkout release/v0.0.2 ``` ## Use Python dotenv It is recommanded to store your important person information in configuration or as environmental variables. Python dotenv allows loading environment variable from `.env` files for easier access and better security. python-dotenv package: https://pypi.org/project/python-dotenv/ -Detailed instructions: https://github.com/swanchain/python-swan-sdk/tree/dev/docs/configuration.md +Detailed instructions: https://github.com/swanchain/python-swan-sdk/tree/release/v0.0.2/docs/configuration.md ## Quick Start Guide for Swan SDK Jump into using the SDK with this quick example: @@ -229,16 +232,12 @@ print(r) ``` ## Examples -For executable examples consult https://github.com/swanchain/python-swan-sdk/tree/dev/examples +For executable examples consult https://github.com/swanchain/python-swan-sdk/tree/release/v0.0.2/examples ## Documentation -For comprehensive documentation, including detailed installation guides, usage examples, and complete API references, please consult https://github.com/swanchain/python-swan-sdk/tree/dev/docs - -## Contributions - -We welcome and encourage community contributions! Please refer to our **CONTRIBUTING.md** for guidelines on how to contribute effectively. +For comprehensive documentation, including detailed installation guides, usage examples, and complete API references, please consult https://github.com/swanchain/python-swan-sdk/tree/release/v0.0.2/docs ## License -The PYTHON SWAN SDK is released under the **MIT-FilSwan** license, details of which can be found in the LICENSE file. +The PYTHON SWAN SDK is released under the **MIT** license, details of which can be found in the LICENSE file. 
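Step 10 of the tutorial added above fetches the deployment URL once, but as the notebook output earlier in this series shows, a freshly validated task first sits in `accepting_bids` before a job is actually reachable. A small polling helper is one way to follow up on it — a sketch only, assuming the `swan_api` and `task_uuid` objects from the tutorial, and assuming `get_real_url()` returns an empty or `None` result until the job is live (the timeout and interval values are arbitrary):

```python
import time

def wait_for_result_url(swan_api, task_uuid, timeout=600, interval=30):
    """Poll the task until a result URL is returned or the timeout expires."""
    deadline = time.time() + timeout
    while time.time() < deadline:
        r = swan_api.get_real_url(task_uuid)
        if r:  # assumed: falsy until the deployed job is reachable
            return r
        time.sleep(interval)
    return None  # nothing reachable within the timeout

url = wait_for_result_url(swan_api, task_uuid)
print(url)
```

If the URL never appears, the task-info output shown in the example notebook (task status, jobs, computing providers) is the place to look for the reason.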
From 49ab0ce21e535f9c1387bebc5c5652931a273755 Mon Sep 17 00:00:00 2001 From: Zihang Chen <159826530+ZihangChenNBAI@users.noreply.github.com> Date: Fri, 3 May 2024 16:58:23 -0400 Subject: [PATCH 08/10] fix indentation --- README.md | 2 +- setup.py | 8 ++++---- 2 files changed, 5 insertions(+), 5 deletions(-) diff --git a/README.md b/README.md index fe5276d9..db1d1fb1 100644 --- a/README.md +++ b/README.md @@ -59,7 +59,7 @@ git checkout release/v0.0.2 ## Use Python dotenv It is recommanded to store your important person information in configuration or as environmental variables. Python dotenv allows loading environment variable from `.env` files for easier access and better security. -python-dotenv package: https://pypi.org/project/python-dotenv/ +python-dotenv package: https://pypi.org/project/python-dotenv/ \ Detailed instructions: https://github.com/swanchain/python-swan-sdk/tree/release/v0.0.2/docs/configuration.md ## Quick Start Guide for Swan SDK diff --git a/setup.py b/setup.py index 632c5f1b..c6d8257e 100644 --- a/setup.py +++ b/setup.py @@ -7,8 +7,8 @@ long_description = fh.read() setup( - name="swan-sdk", - version="0.0.1", + name="orchestrator-sdk", + version="0.0.2", packages=['swan.api', 'swan.common', 'swan.contract', 'swan.object', 'swan.contract.abi'], # package_data={'swan.contract.abi': ['swan/contract/abi/PaymentContract.json', 'swan/contract/abi/SwanToken.json']}, include_package_data=True, @@ -16,8 +16,8 @@ long_description=long_description, long_description_content_type="text/markdown", url="https://github.com/swanchain/orchestrator-sdk", - author="SwanCloud", - author_email="swan.development@nbai.io", + author="ZihangChenNBAI", + author_email="zhchen@nbai.io", license="MIT", classifiers=[ "License :: OSI Approved :: MIT License", From 2d4940063b2ad562f1134dfc3046061d49a7b2a5 Mon Sep 17 00:00:00 2001 From: Zihang Chen <159826530+ZihangChenNBAI@users.noreply.github.com> Date: Fri, 3 May 2024 16:59:28 -0400 Subject: [PATCH 09/10] updated invite link --- PIPRELEASEDOC.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/PIPRELEASEDOC.md b/PIPRELEASEDOC.md index 68c87142..eb70387a 100644 --- a/PIPRELEASEDOC.md +++ b/PIPRELEASEDOC.md @@ -1,2 +1,2 @@ [![Made by FilSwan](https://img.shields.io/badge/made%20by-FilSwan-green.svg)](https://www.filswan.com/) -[![Chat on discord](https://img.shields.io/badge/join%20-discord-brightgreen.svg)](https://discord.com/invite/KKGhy8ZqzK) \ No newline at end of file +[![Chat on discord](https://img.shields.io/badge/join%20-discord-brightgreen.svg)](https://discord.com/invite/swanchain) \ No newline at end of file From 72f318b9edf996e70e1488ff6b141855078649d9 Mon Sep 17 00:00:00 2001 From: Zihang Chen <159826530+ZihangChenNBAI@users.noreply.github.com> Date: Fri, 3 May 2024 17:00:57 -0400 Subject: [PATCH 10/10] updated invite link --- README.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/README.md b/README.md index db1d1fb1..af513234 100644 --- a/README.md +++ b/README.md @@ -1,7 +1,7 @@ # PYTHON SWAN SDK [![Made by FilSwan](https://img.shields.io/badge/made%20by-FilSwan-green.svg)](https://www.filswan.com/) -[![Chat on discord](https://img.shields.io/badge/join%20-discord-brightgreen.svg)](https://discord.com/invite/KKGhy8ZqzK) +[![Chat on discord](https://img.shields.io/badge/join%20-discord-brightgreen.svg)](https://discord.com/invite/swanchain) ## Table Of Contents - [Overview](#overview)
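As a final sanity check after the version bump and packaging changes above, the installed distribution can be inspected before running the tutorial. A convenience sketch, not part of the SDK: it assumes Python 3.8+ (per the README requirement) and simply tries both distribution names that appear in this series — `swan-sdk` in the README install command and `orchestrator-sdk` in `setup.py`:

```python
# Confirm which distribution/version is installed and that the top-level
# import works. Both names from this patch series are tried; neither is
# guaranteed to be present in a given environment.
from importlib.metadata import version, PackageNotFoundError

for dist in ("swan-sdk", "orchestrator-sdk"):
    try:
        print(dist, version(dist))  # expect 0.0.2 for this release
        break
    except PackageNotFoundError:
        continue

from swan import SwanAPI  # import used throughout the docs
print(SwanAPI)
```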