57 commits
6d89544
initial sebs cloudflare infra, functions, config, triggers. readme in…
Oct 31, 2025
f0c60e0
systems.json cloudflare config
MisterMM23 Nov 2, 2025
e5fa13a
highly incomplete work on benchmark wrappers, using R2 and KV.
Nov 2, 2025
ade8ea6
wrappers - changes to handler and storage - can now run benchmark 110…
Nov 8, 2025
fe92792
just some changes. storage still not properly tested...
Nov 9, 2025
024c7c9
translated wrapper to js
Nov 10, 2025
6e2e1be
concept for r2 storage
MisterMM23 Nov 10, 2025
5218ec8
Merge branch 'master' of github.com:ldzgch/serverless-benchmarks-clou…
MisterMM23 Nov 10, 2025
07f757c
used output from workers as analytics measurements in sebs
Nov 10, 2025
e98d51d
last changes necessary for sebs to run cloudflare. now just the stora…
Nov 10, 2025
ca3cb3f
javascript wrapper with polyfills reading from r2. r2 implementation,…
Nov 11, 2025
47d8709
adapted handler to measure invocation time
Nov 11, 2025
bba6a92
fixed the fs polyfill to also support write operationst to r2 storage…
Nov 12, 2025
c0b988b
added compatibility for benchmarks 100 in nodejs. translated all 100 …
Nov 12, 2025
07865e0
current situation where asyncio cannot run the async function
Nov 13, 2025
debe486
dynamically add async to benchmark function *shrug*
Nov 16, 2025
05ef1b1
nosql updates
Nov 16, 2025
4c3d69d
idea for cicrumvention of asyncio
MisterMM23 Nov 17, 2025
e82a6f8
wrappers - run_sync for storage.py
Nov 17, 2025
99f1220
nosql wrapper uses run_sync
Nov 19, 2025
3fb2915
cloudflare nodejs wrapper without r2 as fs polyfill, just node_compat…
Nov 19, 2025
d63e968
cleanup nodejs deployment cloudflare, no uploading files necessary an…
Nov 19, 2025
eeb049b
add folder structure to python code package
Nov 23, 2025
c0c1e50
nosql wrapper - duarble object - may work
Nov 28, 2025
78522dd
fix python. 110 runs for me.
Nov 28, 2025
bdf49bf
make it read the requirements.txt when it has a number
Nov 28, 2025
41a8f68
durable objects compatibility for nodejs
Nov 30, 2025
8790c0b
asyncified the function calls...
Nov 30, 2025
a27013e
added request polyfill for benchmark 120
Nov 30, 2025
93163fd
fix python vendored modules
Dec 1, 2025
0897b7f
Merge branch 'master' of github.com:ldzgch/serverless-benchmarks-clou…
Dec 1, 2025
fa1a7b5
fixed r2 usage for 120, 311
Dec 4, 2025
f5a356d
support for cloudflare containers (python and nodejs), container work…
Dec 7, 2025
3d0f5e7
bigger container for python containers
Dec 8, 2025
f4ffe41
sleep delay longer
Dec 8, 2025
f2dba58
request_id has to be string
Dec 8, 2025
932fbd3
update container fixed
Dec 8, 2025
eafd023
fixed benchmark wrapper request ids for experiment results as well as…
Dec 13, 2025
b02cb08
extract memory correctly
Dec 13, 2025
07f04a0
pyiodide does not support resource module for memory measurement
Dec 14, 2025
3b3bec4
timing fix for cloudflare handler
Dec 15, 2025
e0a6f6b
fixed python timing issue
Dec 15, 2025
46d75ca
removed faulty nodejs implementations of 000 bmks
Jan 5, 2026
50cf891
removed unnecessary logging
Jan 5, 2026
0267fe1
removed experiments.json and package*.json
Jan 6, 2026
6115f9c
updated cloudflare readme to reflect final changes
Jan 6, 2026
5049c42
has platform check according to convention, durable object items remo…
Jan 6, 2026
0d56eb1
updated readme to document the correct return object structure by the…
Jan 6, 2026
17f80dd
documented cold start tracking limitation
Jan 6, 2026
43d6329
removed unreachable return statement in cloudflare.py
Jan 6, 2026
7aafcc0
small fix to use public property
Jan 6, 2026
c581639
small fix for public field in durable objects
Jan 6, 2026
8516207
converted nosql client calls to async and removed the corresponding p…
Jan 7, 2026
8b7003b
Fix instance variable naming in nosql_do class
ldzgch Jan 7, 2026
5d9e36d
Rename class instance reference from nosql to nosql_kv
ldzgch Jan 7, 2026
8d4cc0f
Apply suggestions from code review - storage.py
ldzgch Jan 7, 2026
46e7169
Apply suggestions from code review
ldzgch Jan 7, 2026
@@ -1,6 +1,6 @@
 {
   "timeout": 120,
   "memory": 128,
-  "languages": ["python", "nodejs"],
+  "languages": ["python"],
   "modules": []
 }
56 changes: 56 additions & 0 deletions benchmarks/100.webapps/120.uploader/python/function_cloudflare.py
@@ -0,0 +1,56 @@

import datetime
import os

from pyodide.ffi import run_sync
from pyodide.http import pyfetch

from . import storage
client = storage.storage.get_instance()

SEBS_USER_AGENT = "SeBS/1.2 (https://github.com/spcl/serverless-benchmarks) SeBS Benchmark Suite/1.2"

async def do_request(url, download_path):
    headers = {'User-Agent': SEBS_USER_AGENT}

    res = await pyfetch(url, headers=headers)
    bs = await res.bytes()

    with open(download_path, 'wb') as f:
        f.write(bs)

def handler(event):

    bucket = event.get('bucket').get('bucket')
    output_prefix = event.get('bucket').get('output')
    url = event.get('object').get('url')
    name = os.path.basename(url)
    download_path = '/tmp/{}'.format(name)

    process_begin = datetime.datetime.now()

    run_sync(do_request(url, download_path))

    size = os.path.getsize(download_path)
    process_end = datetime.datetime.now()

    upload_begin = datetime.datetime.now()
    key_name = client.upload(bucket, os.path.join(output_prefix, name), download_path)
    upload_end = datetime.datetime.now()

    process_time = (process_end - process_begin) / datetime.timedelta(microseconds=1)
    upload_time = (upload_end - upload_begin) / datetime.timedelta(microseconds=1)
    return {
        'result': {
            'bucket': bucket,
            'url': url,
            'key': key_name
        },
        'measurement': {
            'download_time': 0,
            'download_size': 0,
            'upload_time': upload_time,
            'upload_size': size,
            'compute_time': process_time
        }
    }
78 changes: 78 additions & 0 deletions benchmarks/100.webapps/130.crud-api/nodejs/function.js
@@ -0,0 +1,78 @@
const nosql = require('./nosql');

const nosqlClient = nosql.nosql.get_instance();
const nosqlTableName = "shopping_cart";

async function addProduct(cartId, productId, productName, price, quantity) {
  await nosqlClient.insert(
    nosqlTableName,
    ["cart_id", cartId],
    ["product_id", productId],
    { price: price, quantity: quantity, name: productName }
  );
}

async function getProducts(cartId, productId) {
  return await nosqlClient.get(
    nosqlTableName,
    ["cart_id", cartId],
    ["product_id", productId]
  );
}

async function queryProducts(cartId) {
  const res = await nosqlClient.query(
    nosqlTableName,
    ["cart_id", cartId],
    "product_id"
  );

  const products = [];
  let priceSum = 0;
  let quantitySum = 0;

  for (const product of res) {
    products.push(product.name);
    priceSum += product.price;
    quantitySum += product.quantity;
  }

  const avgPrice = quantitySum > 0 ? priceSum / quantitySum : 0.0;

  return {
    products: products,
    total_cost: priceSum,
    avg_price: avgPrice
  };
}
Comment on lines +23 to +47
⚠️ Potential issue | 🟠 Major

Fix the price aggregation logic.

Lines 36 and 40 contain incorrect calculations for a shopping cart total:

  • Line 36 adds product.price directly to priceSum, ignoring the quantity for that item
  • Line 40 calculates average price as sum(price) / sum(quantity), which is not a meaningful metric

Since product.price is a unit price (as shown in the addProduct function at line 11), total_cost should be sum(price × quantity), not sum(price). For example, 2 units at $10 + 3 units at $5 should total $35, not $15.

Proposed fix:

   for (const product of res) {
     products.push(product.name);
-    priceSum += product.price;
+    priceSum += product.price * product.quantity;
     quantitySum += product.quantity;
   }
 
   const avgPrice = quantitySum > 0 ? priceSum / quantitySum : 0.0;

This calculates the correct total cost and average price per item.
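As a quick sanity check of the corrected aggregation (an illustrative snippet, not part of the PR), the review's own example works out:

// Illustrative check of the corrected aggregation: 2 units at $10
// plus 3 units at $5 should total $35, at an average unit price of $7.
const items = [
  { name: 'a', price: 10, quantity: 2 },
  { name: 'b', price: 5, quantity: 3 },
];
let priceSum = 0;
let quantitySum = 0;
for (const product of items) {
  priceSum += product.price * product.quantity; // 20, then 35
  quantitySum += product.quantity;              // 2, then 5
}
console.log(priceSum, quantitySum > 0 ? priceSum / quantitySum : 0.0); // 35 7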

exports.handler = async function(event) {
  const results = [];

  for (const request of event.requests) {
    const route = request.route;
    const body = request.body;
    let res;

    if (route === "PUT /cart") {
      await addProduct(
        body.cart,
        body.product_id,
        body.name,
        body.price,
        body.quantity
      );
      res = {};
    } else if (route === "GET /cart/{id}") {
      res = await getProducts(body.cart, request.path.id);
    } else if (route === "GET /cart") {
      res = await queryProducts(body.cart);
    } else {
      throw new Error(`Unknown request route: ${route}`);
    }

    results.push(res);
  }

  return { result: results };
};
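For reference, the event shape the handler expects, inferred from its field accesses (all values here are illustrative):

// Illustrative invocation payload for the handler above; field names come
// from the code (request.route, request.body, request.path), values are made up.
const event = {
  requests: [
    { route: "PUT /cart", body: { cart: "cart-1", product_id: "p-1", name: "widget", price: 10, quantity: 2 } },
    { route: "GET /cart/{id}", body: { cart: "cart-1" }, path: { id: "p-1" } },
    { route: "GET /cart", body: { cart: "cart-1" } }
  ]
};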
9 changes: 9 additions & 0 deletions benchmarks/100.webapps/130.crud-api/nodejs/package.json
@@ -0,0 +1,9 @@
{
  "name": "crud-api",
  "version": "1.0.0",
  "description": "CRUD API benchmark",
  "author": "",
  "license": "",
  "dependencies": {
  }
}
147 changes: 147 additions & 0 deletions benchmarks/300.utilities/311.compression/nodejs/function.js
@@ -0,0 +1,147 @@
const fs = require('fs');
const path = require('path');
const zlib = require('zlib');
const { v4: uuidv4 } = require('uuid');
const storage = require('./storage');

let storage_handler = new storage.storage();

/**
 * Calculate total size of a directory recursively
 * @param {string} directory - Path to directory
 * @returns {number} Total size in bytes
 */
function parseDirectory(directory) {
  let size = 0;

  function walkDir(dir) {
    const files = fs.readdirSync(dir);
    for (const file of files) {
      const filepath = path.join(dir, file);
      const stat = fs.statSync(filepath);
      if (stat.isDirectory()) {
        walkDir(filepath);
      } else {
        size += stat.size;
      }
    }
  }

  walkDir(directory);
  return size;
}

/**
 * Create a simple compressed archive from a directory using native zlib.
 * This creates a gzip-compressed, tar-like archive (a custom length-prefixed
 * format, not a real tar) without external dependencies.
 * @param {string} sourceDir - Directory to compress
 * @param {string} outputPath - Path for the output archive file
 * @returns {Promise<void>}
 */
async function createTarGzArchive(sourceDir, outputPath) {
  // Create a simple tar-like format (concatenated files with headers)
  const files = [];

  function collectFiles(dir, baseDir = '') {
    const entries = fs.readdirSync(dir);
    for (const entry of entries) {
      const fullPath = path.join(dir, entry);
      const relativePath = path.join(baseDir, entry);
      const stat = fs.statSync(fullPath);

      if (stat.isDirectory()) {
        collectFiles(fullPath, relativePath);
      } else {
        files.push({
          path: relativePath,
          fullPath: fullPath,
          size: stat.size
        });
      }
    }
  }

  collectFiles(sourceDir);

  // Create a concatenated buffer of all files with simple headers
  const chunks = [];
  for (const file of files) {
    const content = fs.readFileSync(file.fullPath);
    // Simple header: filename length (4 bytes) + filename + content length (4 bytes) + content
    const pathBuffer = Buffer.from(file.path);
    const pathLengthBuffer = Buffer.allocUnsafe(4);
    pathLengthBuffer.writeUInt32BE(pathBuffer.length, 0);
    const contentLengthBuffer = Buffer.allocUnsafe(4);
    contentLengthBuffer.writeUInt32BE(content.length, 0);

    chunks.push(pathLengthBuffer);
    chunks.push(pathBuffer);
    chunks.push(contentLengthBuffer);
    chunks.push(content);
  }

  const combined = Buffer.concat(chunks);

  // Compress using gzip
  const compressed = zlib.gzipSync(combined, { level: 9 });
  fs.writeFileSync(outputPath, compressed);
}

exports.handler = async function(event) {
  const bucket = event.bucket.bucket;
  const input_prefix = event.bucket.input;
  const output_prefix = event.bucket.output;
  const key = event.object.key;

  // Create unique download path
  const download_path = path.join('/tmp', `${key}-${uuidv4()}`);
  fs.mkdirSync(download_path, { recursive: true });

  // Download directory from storage
  const s3_download_begin = Date.now();
  await storage_handler.download_directory(bucket, path.join(input_prefix, key), download_path);
  const s3_download_stop = Date.now();

  // Calculate size of downloaded files
  const size = parseDirectory(download_path);

  // Compress directory
  const compress_begin = Date.now();
  const archive_name = `${key}.tar.gz`;
  const archive_path = path.join(download_path, archive_name);
  await createTarGzArchive(download_path, archive_path);
  const compress_end = Date.now();

  // Get archive size
  const archive_size = fs.statSync(archive_path).size;

  // Upload compressed archive
  const s3_upload_begin = Date.now();
  const [key_name, uploadPromise] = storage_handler.upload(
    bucket,
    path.join(output_prefix, archive_name),
    archive_path
  );
  await uploadPromise;
  const s3_upload_stop = Date.now();

  // Calculate times in microseconds
  const download_time = (s3_download_stop - s3_download_begin) * 1000;
  const upload_time = (s3_upload_stop - s3_upload_begin) * 1000;
  const process_time = (compress_end - compress_begin) * 1000;

  return {
    result: {
      bucket: bucket,
      key: key_name
    },
    measurement: {
      download_time: download_time,
      download_size: size,
      upload_time: upload_time,
      upload_size: archive_size,
      compute_time: process_time
    }
  };
};
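Since the archive format above is custom (length-prefixed entries, then gzip) rather than real tar, a matching reader can be useful for verifying output. A minimal sketch, not part of the PR:

// Minimal sketch of a reader for the custom archive format produced by
// createTarGzArchive: gunzip, then repeatedly parse
// [4-byte BE path length][path][4-byte BE content length][content].
const fs = require('fs');
const zlib = require('zlib');

function readArchive(archivePath) {
  const buffer = zlib.gunzipSync(fs.readFileSync(archivePath));
  const entries = [];
  let offset = 0;
  while (offset < buffer.length) {
    const pathLength = buffer.readUInt32BE(offset);
    offset += 4;
    const filePath = buffer.toString('utf8', offset, offset + pathLength);
    offset += pathLength;
    const contentLength = buffer.readUInt32BE(offset);
    offset += 4;
    entries.push({ path: filePath, content: buffer.subarray(offset, offset + contentLength) });
    offset += contentLength;
  }
  return entries;
}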

9 changes: 9 additions & 0 deletions benchmarks/300.utilities/311.compression/nodejs/package.json
@@ -0,0 +1,9 @@
{
  "name": "compression-benchmark",
  "version": "1.0.0",
  "description": "Compression benchmark for serverless platforms",
  "main": "function.js",
  "dependencies": {
    "uuid": "^10.0.0"
  }
}