SeBS Cloudflare Compatibility & 311/130 Benchmark Translation to Nodejs #274
Status: Open. userlaurin wants to merge 57 commits into spcl:master from ldzgch:master.
Changes from all commits (57):
- `6d89544` initial sebs cloudflare infra, functions, config, triggers. readme in…
- `f0c60e0` systems.json cloudflare config
- `e5fa13a` (MisterMM23) highly incomplete work on benchmark wrappers, using R2 and KV.
- `ade8ea6` wrappers - changes to handler and storage - can now run benchmark 110…
- `fe92792` just some changes. storage still not properly tested...
- `024c7c9` translated wrapper to js
- `6e2e1be` concept for r2 storage
- `5218ec8` (MisterMM23) Merge branch 'master' of github.com:ldzgch/serverless-benchmarks-clou…
- `07f757c` (MisterMM23) used output from workers as analytics measurements in sebs
- `e98d51d` last changes necessary for sebs to run cloudflare. now just the stora…
- `ca3cb3f` javascript wrapper with polyfills reading from r2. r2 implementation,…
- `47d8709` adapted handler to measure invocation time
- `bba6a92` fixed the fs polyfill to also support write operationst to r2 storage…
- `c0b988b` added compatibility for benchmarks 100 in nodejs. translated all 100 …
- `07865e0` current situation where asyncio cannot run the async function
- `debe486` dynamically add async to benchmark function *shrug*
- `05ef1b1` nosql updates
- `4c3d69d` idea for cicrumvention of asyncio
- `e82a6f8` (MisterMM23) wrappers - run_sync for storage.py
- `99f1220` nosql wrapper uses run_sync
- `3fb2915` cloudflare nodejs wrapper without r2 as fs polyfill, just node_compat…
- `d63e968` cleanup nodejs deployment cloudflare, no uploading files necessary an…
- `eeb049b` add folder structure to python code package
- `c0c1e50` nosql wrapper - duarble object - may work
- `78522dd` fix python. 110 runs for me.
- `bdf49bf` make it read the requirements.txt when it has a number
- `41a8f68` durable objects compatibility for nodejs
- `8790c0b` asyncified the function calls...
- `a27013e` added request polyfill for benchmark 120
- `93163fd` fix python vendored modules
- `0897b7f` Merge branch 'master' of github.com:ldzgch/serverless-benchmarks-clou…
- `fa1a7b5` fixed r2 usage for 120, 311
- `f5a356d` support for cloudflare containers (python and nodejs), container work…
- `3d0f5e7` bigger container for python containers
- `f4ffe41` sleep delay longer
- `f2dba58` request_id has to be string
- `932fbd3` update container fixed
- `eafd023` fixed benchmark wrapper request ids for experiment results as well as…
- `b02cb08` extract memory correctly
- `07f04a0` pyiodide does not support resource module for memory measurement
- `3b3bec4` timing fix for cloudflare handler
- `e0a6f6b` fixed python timing issue
- `46d75ca` removed faulty nodejs implementations of 000 bmks
- `50cf891` removed unnecessary logging
- `0267fe1` removed experiments.json and package*.json
- `6115f9c` updated cloudflare readme to reflect final changes
- `5049c42` has platform check according to convention, durable object items remo…
- `0d56eb1` updated readme to document the correct return object structure by the…
- `17f80dd` documented cold start tracking limitation
- `43d6329` removed unreachable return statement in cloudflare.py
- `7aafcc0` small fix to use public property
- `c581639` small fix for public field in durable objects
- `8516207` converted nosql client calls to async and removed the corresponding p…
- `8b7003b` Fix instance variable naming in nosql_do class
- `5d9e36d` (ldzgch) Rename class instance reference from nosql to nosql_kv
- `8d4cc0f` (ldzgch) Apply suggestions from code review - storage.py
- `46e7169` (ldzgch) Apply suggestions from code review
The first file hunk is a benchmark `config.json`; the rendered diff drops the Node.js language entry:

```diff
@@ -1,6 +1,6 @@
 {
   "timeout": 120,
   "memory": 128,
-  "languages": ["python", "nodejs"],
+  "languages": ["python"],
   "modules": []
 }
```
benchmarks/100.webapps/120.uploader/python/function_cloudflare.py (56 additions, 0 deletions):
```python
import datetime
import os

from pyodide.ffi import run_sync
from pyodide.http import pyfetch

from . import storage
client = storage.storage.get_instance()

SEBS_USER_AGENT = "SeBS/1.2 (https://github.com/spcl/serverless-benchmarks) SeBS Benchmark Suite/1.2"


async def do_request(url, download_path):
    headers = {'User-Agent': SEBS_USER_AGENT}

    res = await pyfetch(url, headers=headers)
    bs = await res.bytes()

    with open(download_path, 'wb') as f:
        f.write(bs)


def handler(event):

    bucket = event.get('bucket').get('bucket')
    output_prefix = event.get('bucket').get('output')
    url = event.get('object').get('url')
    name = os.path.basename(url)
    download_path = '/tmp/{}'.format(name)

    process_begin = datetime.datetime.now()

    run_sync(do_request(url, download_path))

    size = os.path.getsize(download_path)
    process_end = datetime.datetime.now()

    upload_begin = datetime.datetime.now()
    key_name = client.upload(bucket, os.path.join(output_prefix, name), download_path)
    upload_end = datetime.datetime.now()

    process_time = (process_end - process_begin) / datetime.timedelta(microseconds=1)
    upload_time = (upload_end - upload_begin) / datetime.timedelta(microseconds=1)
    return {
        'result': {
            'bucket': bucket,
            'url': url,
            'key': key_name
        },
        'measurement': {
            'download_time': 0,
            'download_size': 0,
            'upload_time': upload_time,
            'upload_size': size,
            'compute_time': process_time
        }
    }
```
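The handler above is synchronous and uses Pyodide's `run_sync` to drive the async `pyfetch` download. For context, a hypothetical invocation sketch in Node.js: the endpoint URL and bucket/object values are invented, and the assumption that the deployed worker accepts the event as a JSON POST body is not confirmed by this diff.

```javascript
// Hypothetical invocation of a deployed 120.uploader worker.
// Only the event shape is taken from the handler above; every
// concrete value (endpoint, bucket names, URL) is made up.
const event = {
  bucket: { bucket: 'sebs-benchmarks', output: 'output-120' },
  object: { url: 'https://example.com/testfile.bin' }
};

async function invokeUploader(workerUrl) {
  const res = await fetch(workerUrl, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(event)
  });
  // The handler returns { result: {...}, measurement: {...} }.
  return res.json();
}

invokeUploader('https://uploader.example.workers.dev')
  .then((out) => console.log(out.measurement))
  .catch(console.error);
```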
benchmarks/100.webapps/130.crud-api/nodejs/function.js (78 additions, 0 deletions):
```javascript
const nosql = require('./nosql');

const nosqlClient = nosql.nosql.get_instance();
const nosqlTableName = "shopping_cart";

async function addProduct(cartId, productId, productName, price, quantity) {
  await nosqlClient.insert(
    nosqlTableName,
    ["cart_id", cartId],
    ["product_id", productId],
    { price: price, quantity: quantity, name: productName }
  );
}

async function getProducts(cartId, productId) {
  return await nosqlClient.get(
    nosqlTableName,
    ["cart_id", cartId],
    ["product_id", productId]
  );
}

async function queryProducts(cartId) {
  const res = await nosqlClient.query(
    nosqlTableName,
    ["cart_id", cartId],
    "product_id"
  );

  const products = [];
  let priceSum = 0;
  let quantitySum = 0;

  for (const product of res) {
    products.push(product.name);
    priceSum += product.price;
    quantitySum += product.quantity;
  }

  const avgPrice = quantitySum > 0 ? priceSum / quantitySum : 0.0;

  return {
    products: products,
    total_cost: priceSum,
    avg_price: avgPrice
  };
}

exports.handler = async function(event) {
  const results = [];

  for (const request of event.requests) {
    const route = request.route;
    const body = request.body;
    let res;

    if (route === "PUT /cart") {
      await addProduct(
        body.cart,
        body.product_id,
        body.name,
        body.price,
        body.quantity
      );
      res = {};
    } else if (route === "GET /cart/{id}") {
      res = await getProducts(body.cart, request.path.id);
    } else if (route === "GET /cart") {
      res = await queryProducts(body.cart);
    } else {
      throw new Error(`Unknown request route: ${route}`);
    }

    results.push(res);
  }

  return { result: results };
};
```
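A minimal local sketch of the event this handler expects, exercising all three routes it dispatches on (cart and product values are invented; actually running it requires the `./nosql` wrapper from this PR):

```javascript
// Hypothetical event: one PUT and two GETs against the same cart.
// The shape mirrors what the handler above reads from `event.requests`.
const event = {
  requests: [
    { route: 'PUT /cart',
      body: { cart: 'cart-1', product_id: 'p-42', name: 'widget', price: 10, quantity: 2 } },
    { route: 'GET /cart/{id}', body: { cart: 'cart-1' }, path: { id: 'p-42' } },
    { route: 'GET /cart', body: { cart: 'cart-1' } }
  ]
};

// require('./function').handler(event).then((out) => console.log(out.result));
```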
package.json for the crud-api Node.js benchmark (9 additions, 0 deletions):
```json
{
  "name": "crud-api",
  "version": "1.0.0",
  "description": "CRUD API benchmark",
  "author": "",
  "license": "",
  "dependencies": {
  }
}
```
benchmarks/300.utilities/311.compression/nodejs/function.js (147 additions, 0 deletions):
```javascript
const fs = require('fs');
const path = require('path');
const zlib = require('zlib');
const { v4: uuidv4 } = require('uuid');
const storage = require('./storage');

let storage_handler = new storage.storage();

/**
 * Calculate total size of a directory recursively
 * @param {string} directory - Path to directory
 * @returns {number} Total size in bytes
 */
function parseDirectory(directory) {
  let size = 0;

  function walkDir(dir) {
    const files = fs.readdirSync(dir);
    for (const file of files) {
      const filepath = path.join(dir, file);
      const stat = fs.statSync(filepath);
      if (stat.isDirectory()) {
        walkDir(filepath);
      } else {
        size += stat.size;
      }
    }
  }

  walkDir(directory);
  return size;
}

/**
 * Create a simple tar.gz archive from a directory using native zlib
 * This creates a gzip-compressed tar archive without external dependencies
 * @param {string} sourceDir - Directory to compress
 * @param {string} outputPath - Path for the output archive file
 * @returns {Promise<void>}
 */
async function createTarGzArchive(sourceDir, outputPath) {
  // Create a simple tar-like format (concatenated files with headers)
  const files = [];

  function collectFiles(dir, baseDir = '') {
    const entries = fs.readdirSync(dir);
    for (const entry of entries) {
      const fullPath = path.join(dir, entry);
      const relativePath = path.join(baseDir, entry);
      const stat = fs.statSync(fullPath);

      if (stat.isDirectory()) {
        collectFiles(fullPath, relativePath);
      } else {
        files.push({
          path: relativePath,
          fullPath: fullPath,
          size: stat.size
        });
      }
    }
  }

  collectFiles(sourceDir);

  // Create a concatenated buffer of all files with simple headers
  const chunks = [];
  for (const file of files) {
    const content = fs.readFileSync(file.fullPath);
    // Simple header: filename length (4 bytes) + filename + content length (4 bytes) + content
    const pathBuffer = Buffer.from(file.path);
    const pathLengthBuffer = Buffer.allocUnsafe(4);
    pathLengthBuffer.writeUInt32BE(pathBuffer.length, 0);
    const contentLengthBuffer = Buffer.allocUnsafe(4);
    contentLengthBuffer.writeUInt32BE(content.length, 0);

    chunks.push(pathLengthBuffer);
    chunks.push(pathBuffer);
    chunks.push(contentLengthBuffer);
    chunks.push(content);
  }

  const combined = Buffer.concat(chunks);

  // Compress using gzip
  const compressed = zlib.gzipSync(combined, { level: 9 });
  fs.writeFileSync(outputPath, compressed);
}

exports.handler = async function(event) {
  const bucket = event.bucket.bucket;
  const input_prefix = event.bucket.input;
  const output_prefix = event.bucket.output;
  const key = event.object.key;

  // Create unique download path
  const download_path = path.join('/tmp', `${key}-${uuidv4()}`);
  fs.mkdirSync(download_path, { recursive: true });

  // Download directory from storage
  const s3_download_begin = Date.now();
  await storage_handler.download_directory(bucket, path.join(input_prefix, key), download_path);
  const s3_download_stop = Date.now();

  // Calculate size of downloaded files
  const size = parseDirectory(download_path);

  // Compress directory
  const compress_begin = Date.now();
  const archive_name = `${key}.tar.gz`;
  const archive_path = path.join(download_path, archive_name);
  await createTarGzArchive(download_path, archive_path);
  const compress_end = Date.now();

  // Get archive size
  const archive_size = fs.statSync(archive_path).size;

  // Upload compressed archive
  const s3_upload_begin = Date.now();
  const [key_name, uploadPromise] = storage_handler.upload(
    bucket,
    path.join(output_prefix, archive_name),
    archive_path
  );
  await uploadPromise;
  const s3_upload_stop = Date.now();

  // Calculate times in microseconds
  const download_time = (s3_download_stop - s3_download_begin) * 1000;
  const upload_time = (s3_upload_stop - s3_upload_begin) * 1000;
  const process_time = (compress_end - compress_begin) * 1000;

  return {
    result: {
      bucket: bucket,
      key: key_name
    },
    measurement: {
      download_time: download_time,
      download_size: size,
      upload_time: upload_time,
      upload_size: archive_size,
      compute_time: process_time
    }
  };
};
```
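Note that despite the `.tar.gz` name, `createTarGzArchive` does not emit real tar: it writes a custom length-prefixed concatenation and gzips it. A decoder sketch for that format (`unpackArchive` is a hypothetical helper, not part of the PR; it assumes an archive produced by the function above):

```javascript
const fs = require('fs');
const zlib = require('zlib');

// Hypothetical helper reversing createTarGzArchive's layout: each entry is
// a 4-byte big-endian name length, the name bytes, a 4-byte big-endian
// content length, then the content bytes; the whole stream is gzipped.
function unpackArchive(archivePath) {
  const buf = zlib.gunzipSync(fs.readFileSync(archivePath));
  const entries = [];
  let offset = 0;
  while (offset < buf.length) {
    const nameLen = buf.readUInt32BE(offset); offset += 4;
    const name = buf.toString('utf8', offset, offset + nameLen); offset += nameLen;
    const contentLen = buf.readUInt32BE(offset); offset += 4;
    entries.push({ name, content: buf.subarray(offset, offset + contentLen) });
    offset += contentLen;
  }
  return entries;
}
```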
package.json for the compression Node.js benchmark (9 additions, 0 deletions):
```json
{
  "name": "compression-benchmark",
  "version": "1.0.0",
  "description": "Compression benchmark for serverless platforms",
  "main": "function.js",
  "dependencies": {
    "uuid": "^10.0.0"
  }
}
```
Review comment on `benchmarks/100.webapps/130.crud-api/nodejs/function.js`:
Fix the price aggregation logic.
Lines 36 and 40 contain incorrect calculations for the shopping-cart total:

- Line 36 adds `product.price` directly to `priceSum`, ignoring the quantity for that item.
- Line 40 computes `sum(price) / sum(quantity)`, which is not a meaningful metric.

Since `product.price` is a unit price (as shown in the `addProduct` function at line 11), `total_cost` should be `sum(price × quantity)`, not `sum(price)`. For example, 2 units at $10 + 3 units at $5 should total $35, not $15.

Proposed fix:

```diff
   for (const product of res) {
     products.push(product.name);
-    priceSum += product.price;
+    priceSum += product.price * product.quantity;
     quantitySum += product.quantity;
   }

   const avgPrice = quantitySum > 0 ? priceSum / quantitySum : 0.0;
```

This calculates the correct total cost and average price per item.
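A quick standalone check of the review's numeric example with the corrected aggregation (self-contained; no nosql wrapper needed):

```javascript
// Two rows matching the example: 2 units at $10 and 3 units at $5.
const res = [
  { name: 'a', price: 10, quantity: 2 },
  { name: 'b', price: 5, quantity: 3 }
];

let priceSum = 0;
let quantitySum = 0;
for (const product of res) {
  priceSum += product.price * product.quantity; // corrected line
  quantitySum += product.quantity;
}

console.log(priceSum);               // 35, not 15 as before the fix
console.log(priceSum / quantitySum); // 7: average price per unit
```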