Comparing Deno vs Node vs Bun

August 5, 2024 · 0 comments · Bun, JavaScript

This is an unscientific comparison, updating previous blog posts that compared Node and Bun but didn't include Deno.

Temperature conversion

In Converting Celsius to Fahrenheit round-up, I compared a super simple script that just prints a couple of lines of text after some basic computation. If you include Deno in that run you get:

``````
❯ hyperfine --shell=none --warmup 3 "bun run conversion.js" "node conversion.js" "deno run conversion.js"
Benchmark 1: bun run conversion.js
Time (mean ± σ):      22.2 ms ±   2.1 ms    [User: 12.4 ms, System: 8.6 ms]
Range (min … max):    20.6 ms …  36.0 ms    136 runs

Warning: Statistical outliers were detected. Consider re-running this benchmark on a quiet system without any interferences from other programs. It might help to use the '--warmup' or '--prepare' options.

...

Summary
bun run conversion.js ran
1.97 ± 0.35 times faster than deno run conversion.js
2.41 ± 0.39 times faster than node conversion.js
``````

Note that `bun run` and `deno run` both support `.ts` files whereas Node needs it to be `.js` (unless you use something like `--require @swc-node/register`). So in this benchmark, I let `bun run` and `deno run` use the `.js` version.
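The script itself isn't shown in this post, but based on the TypeScript version further down the page, the `.js` variant boils down to something like this (a sketch, not the exact file):

```javascript
// Sketch of the kind of conversion script being benchmarked
// (the real conversion.js lives in the temperature-conversion repo).
function c2f(c) {
  return (c * 9) / 5 + 32;
}

for (let c = 4; c < 100; c += 12) {
  console.log(`${c}°C ~= ${c2f(c)}°F`);
}
```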

``````
❯ deno --version
deno 1.45.2 (release, x86_64-apple-darwin)
v8 12.7.224.12
typescript 5.5.2

❯ node --version
v20.14.0

❯ bun --version
1.1.21
``````

Leibniz formula

In Leibniz formula for π in Python, JavaScript, and Ruby I wrote a simple program that computes the value of π using the Leibniz formula. It became a comparison of that code implementation in Python vs. Ruby vs. Node.

But let's redo the test with Bun and Deno too. The code was:

``````
let sum = 0;
let estimate = 0;
let i = 0;
const epsilon = 0.0001;

while (Math.abs(estimate - Math.PI) > epsilon) {
  sum += (-1) ** i / (2 * i + 1);
  estimate = sum * 4;
  i += 1;
}
console.log(
  `After ${i} iterations, the estimate is ${estimate} and the real pi is ${Math.PI} ` +
    `(difference of ${Math.abs(estimate - Math.PI)})`
);
``````

Running it once, it prints:

``````
❯ deno run pi.js
After 10000 iterations, the estimate is 3.1414926535900345 and the real pi is 3.141592653589793 (difference of 0.0000999999997586265)
``````
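Incidentally, those 10,000 iterations line up with theory: for the Leibniz series, the error of the ×4 estimate after i terms is roughly 1/i, so reaching an epsilon of 0.0001 takes on the order of 10,000 iterations. A quick sanity check of that claim:

```javascript
// The alternating-series error for the Leibniz formula:
// |4 * S_i - pi| is roughly 1/i, so epsilon = 1e-4 needs i around 10,000.
let sum = 0;
let i = 0;
while (Math.abs(sum * 4 - Math.PI) > 0.0001) {
  sum += (-1) ** i / (2 * i + 1);
  i += 1;
}
console.log(i); // on the order of 10,000
```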

Running them becomes more of a measurement of how fast the programs start than how fast they run, but that's nevertheless interesting to know too:

``````
❯ hyperfine --warmup 3 "node pi.js" "bun run pi.js" "deno run pi.js"
Benchmark 1: node pi.js
Time (mean ± σ):      54.9 ms ±   6.5 ms    [User: 42.6 ms, System: 11.3 ms]
Range (min … max):    50.2 ms …  83.9 ms    48 runs

Warning: Statistical outliers were detected. Consider re-running this benchmark on a quiet system without any interferences from other programs. It might help to use the '--warmup' or '--prepare' options.

...

Summary
bun run pi.js ran
1.92 ± 1.01 times faster than deno run pi.js
2.37 ± 0.31 times faster than node pi.js
``````

Conclusion

Both of the programs I'm comparing are super trivial and take virtually no time to run once they've started, so it becomes more a test of warm-start performance. Still, it's cool to see that both Deno and Bun do a better job of it here. Bun is almost 2x faster than Deno and 2.5x faster than Node.

Converting Celsius to Fahrenheit round-up

July 22, 2024 · 0 comments · Go, Node, Python, Bun, Ruby, Rust, JavaScript

In the last couple of days, I've created variations of a simple algorithm to demonstrate how Celsius and Fahrenheit seem to relate to each other if you "mirror the number".
It wasn't supposed to be about the programming language. Still, I used Python in the first one, and since the code is simple, I figured it could be fun to write variants of it in other languages.

It was a fun exercise.
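To make the "mirror" idea concrete, here's a tiny sketch (the helper names are mine, not from the repo): zero-pad the Celsius value to two digits, reverse it, and you land close to the Fahrenheit value.

```javascript
// Reversing the digits of the (zero-padded) Celsius value
// approximates the Fahrenheit conversion surprisingly well.
function c2f(c) {
  return (c * 9) / 5 + 32;
}

function mirror(n) {
  return Number(String(n).padStart(2, "0").split("").reverse().join(""));
}

for (const c of [4, 16, 28]) {
  console.log(`${c}°C: mirrored ${mirror(c)}, actual ${c2f(c).toFixed(1)}°F`);
}
```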

And speaking of fun, I couldn't help but throw in a benchmark using `hyperfine` that measures, essentially, how fast these CLIs can start up. The results look like this:

``````
Summary
./conversion-rs ran
1.31 ± 1.30 times faster than ./conversion-go
1.88 ± 1.33 times faster than ./conversion-cr
7.15 ± 4.64 times faster than bun run conversion.ts
14.27 ± 9.48 times faster than python3.12 conversion.py
18.10 ± 12.35 times faster than node conversion.js
67.75 ± 43.80 times faster than ruby conversion.rb
``````

It doesn't prove much that you didn't already expect. But it's fun to see how fast Python 3.12 has become at starting up.

Head on over to https://github.com/peterbe/temperature-conversion to play along. Perhaps you can see some easy optimizations (speed and style).

Converting Celsius to Fahrenheit with TypeScript

July 16, 2024 · 0 comments · Bun, JavaScript

This is a continuation of Converting Celsius to Fahrenheit with Python, but in TypeScript:

``````
function c2f(c: number): number {
  return (c * 9) / 5 + 32;
}

function isMirror(a: number, b: number) {
  function massage(n: number) {
    if (n < 10) return `0${n}`;
    else if (n >= 100) return massage(n - 100);
    return `${n}`;
  }
  return reverseString(massage(a)) === massage(b);
}

function reverseString(str: string) {
  return str.split("").reverse().join("");
}

function printConversion(c: number, f: number) {
  console.log(`${c}°C ~= ${f}°F`);
}

for (let c = 4; c < 100; c += 12) {
  const f = c2f(c);
  if (isMirror(c, Math.ceil(f))) {
    printConversion(c, Math.ceil(f));
  } else if (isMirror(c, Math.floor(f))) {
    printConversion(c, Math.floor(f));
  } else {
    break;
  }
}
``````

And when you run it:

``````
❯ bun run conversion.ts
4°C ~= 40°F
16°C ~= 61°F
28°C ~= 82°F
40°C ~= 104°F
52°C ~= 125°F
``````

Introducing hylite - a Node code-syntax-to-HTML highlighter written in Bun

October 3, 2023 · 0 comments · Node, Bun, JavaScript

`hylite` is a command-line tool for syntax-highlighting code into HTML. You feed it a file or a snippet of code (plus what language it is) and it returns a string of HTML.

Suppose you have:

``````
❯ cat example.py
# This is example.py
def hello():
    return "world"
``````

When you run this through `hylite` you get:

``````
❯ npx hylite example.py
<span class="hljs-keyword">def</span> <span class="hljs-title function_">hello</span>():
    <span class="hljs-keyword">return</span> <span class="hljs-string">&quot;world&quot;</span>
``````

Now, with the necessary CSS in place, it can finally render this:

``````
# This is example.py
def hello():
    return "world"
``````

(Note: At the time of writing this, `npx hylite --list-css` or `npx hylite --css` don't work unless you've `git clone`d the `github.com/peterbe/hylite` repo)

How I use it

This originated because I loved how `highlight.js` works. It supports numerous languages, can even guess the language, is fast as heck, and the HTML output is compact.

Originally, my personal website, whose backend is in Python/Django, was using `Pygments` to do the syntax highlighting. The problem with that is it doesn't support JSX (or TSX). For example:

``````
export function Bell({ color }: { color: string }) {
  return <div style={{ backgroundColor: color }}>Ding!</div>;
}
``````

The problem is that Python != Node, so to call out to `hylite` I use a sub-process. At the moment, I can't use `bunx` or `npx` because that depends on `$PATH` and stuff that the server doesn't have. Here's how I call `hylite` from Python:

``````
command = settings.HYLITE_COMMAND.split()
assert language
command.extend(["--language", language, "--wrapped"])
process = subprocess.Popen(
    command,
    stdin=subprocess.PIPE,
    stdout=subprocess.PIPE,
    stderr=subprocess.PIPE,
    text=True,
    cwd=settings.HYLITE_DIRECTORY,
)
process.stdin.write(code)
output, error = process.communicate()
``````

The settings are:

``````
HYLITE_DIRECTORY = "/home/django/hylite"
HYLITE_COMMAND = "node dist/index.js"
``````

How I built `hylite`

What's different about `hylite`, compared to other JavaScript packages and CLIs like this, is that the development requires Bun. It's lovely because Bun has a built-in test runner and a TypeScript transpiler, and it's just so delightfully fast at starting up, whatever you do with it.

At the moment, I see Bun as analogous to TypeScript: convenient when developing, but once stripped away it's just good old JavaScript and you don't have to worry about compatibility.

So I use `bun` for manual testing, like `bun run src/index.ts < foo.go`, but when it comes time to ship, I run `bun run build` (which executes `src/build.ts` with `bun`), which builds a `dist/index.js` file that you can run with either `node` or `bun` anywhere.

By the way, the README has a section on benchmarking. It concludes two things:

1. `node dist/index.js` has the same performance as `bun run dist/index.js`
2. `bunx hylite` is 7x faster than `npx hylite`, but it's bullcrap because `bunx` doesn't check the network for a new version (...until you restart your computer)

Parse a CSV file with Bun

September 13, 2023 · 0 comments · Bun

I'm really excited about Bun and look forward to trying it out more and more.
Today I needed a quick script to parse a CSV file to compute some simple arithmetic on some numbers in it.

To do that, here's what I did:

``````
bun init
bun install csv-simple-parser
code index.ts
``````

And the code:

``````
import parse from "csv-simple-parser";

console.time("total");
const numbers: number[] = [];
const file = Bun.file(process.argv.slice(2)[0]);
type Rec = {
  Pageviews: string;
};
const csv = parse(await file.text(), { header: true }) as Rec[];
for (const row of csv) {
  numbers.push(parseInt(row["Pageviews"] || "0"));
}
console.timeEnd("total");
console.log("Mean  ", numbers.reduce((a, b) => a + b, 0) / numbers.length);
// Note: sort() needs a numeric comparator, or the numbers get sorted as strings
console.log("Median", numbers.sort((a, b) => a - b)[Math.floor(numbers.length / 2)]);
``````

And running it:

``````
❯ wc -l file.csv
13623 file.csv

❯ /usr/bin/time bun run index.ts file.csv
[8.20ms] total
Mean   7.205534757395581
Median 1
0.04 real         0.03 user         0.01 sys
``````

(On my Intel MacBook Pro...) Reading in the file and parsing the 13k lines took 8.2 milliseconds. The whole execution took 0.04 seconds. Pretty neat.
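One JavaScript gotcha worth calling out for scripts like this: `Array.prototype.sort` compares elements as strings by default, so sorting numbers without a comparator mangles the median.

```javascript
// Default sort is lexicographic, even for numbers.
const values = [10, 9, 100, 1];
console.log([...values].sort()); // [ 1, 10, 100, 9 ]
console.log([...values].sort((a, b) => a - b)); // [ 1, 9, 10, 100 ]
```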

Hello-world server in Bun vs Fastify

September 9, 2023 · 4 comments · Node, JavaScript, Bun

Bun 1.0 just launched and I'm genuinely impressed and intrigued. How long can this madness keep going? I've never built anything substantial with Bun. Just various scripts to get a feel for it.

At work, I recently launched a micro-service that uses Node + Fastify + TypeScript. I'm not going to rewrite it in Bun, but I'm going to get a feel for the difference.

Basic version in Bun

No need for a `package.json` at this point. And that's neat. Create a `src/index.ts` and put this in:

``````
const PORT = parseInt(process.env.PORT || "3000");

Bun.serve({
  port: PORT,
  fetch(req) {
    const url = new URL(req.url);
    if (url.pathname === "/") return new Response(`Home page!`);
    if (url.pathname === "/json") return Response.json({ hello: "world" });
    return new Response(`404!`);
  },
});
console.log(`Listening on port ${PORT}`);
``````

What's so cool about the convenience-oriented developer experience of Bun is that it comes with a native way to restart the server as you edit the server code:

``````
❯ bun --hot src/index.ts
Listening on port 3000
``````

Let's test it:

``````
❯ xh http://localhost:3000/
HTTP/1.1 200 OK
Content-Length: 10
Content-Type: text/plain;charset=utf-8
Date: Sat, 09 Sep 2023 02:34:29 GMT

❯ xh http://localhost:3000/json
HTTP/1.1 200 OK
Content-Length: 17
Content-Type: application/json;charset=utf-8
Date: Sat, 09 Sep 2023 02:34:35 GMT

{
"hello": "world"
}
``````

Basic version with Node + Fastify + TypeScript

First of all, you'll need to create a `package.json` to install the dependencies, all of which, at this point, are built into Bun:

``````
❯ npm i -D ts-node typescript @types/node nodemon
❯ npm i fastify
``````

And edit the `package.json` with some scripts:

``````
"scripts": {
  "dev": "nodemon src/index.ts",
  "start": "ts-node src/index.ts"
},
``````

And of course, the code itself (`src/index.ts`):

``````
import fastify from "fastify";

const PORT = parseInt(process.env.PORT || "3000");

const server = fastify();

server.get("/", async () => {
  return "Home page!";
});

server.get("/json", (request, reply) => {
  reply.send({ hello: "world" });
});

server.listen({ port: PORT }, (err, address) => {
  if (err) {
    console.error(err);
    process.exit(1);
  }
  console.log(`Server listening at ${address}`);
});
``````

Now run it:

``````
❯ npm run dev

> fastify-hello-world@1.0.0 dev
> nodemon src/index.ts

[nodemon] 3.0.1
[nodemon] to restart at any time, enter `rs`
[nodemon] watching path(s): *.*
[nodemon] watching extensions: ts,json
[nodemon] starting `ts-node src/index.ts`
Server listening at http://[::1]:3000
``````

Let's test it:

``````
❯ xh http://localhost:3000/
HTTP/1.1 200 OK
Connection: keep-alive
Content-Length: 10
Content-Type: text/plain; charset=utf-8
Date: Sat, 09 Sep 2023 02:42:46 GMT
Keep-Alive: timeout=72

❯ xh http://localhost:3000/json
HTTP/1.1 200 OK
Connection: keep-alive
Content-Length: 17
Content-Type: application/json; charset=utf-8
Date: Sat, 09 Sep 2023 02:43:08 GMT
Keep-Alive: timeout=72

{
"hello": "world"
}
``````

For the record, I quite like this little setup. `nodemon` can automatically understand TypeScript. It's a neat minimum if you want to stick with Node.

Quick benchmark

Bun

Note that this server has no logging or any I/O.

``````
❯ bun src/index.ts
Listening on port 3000
``````

Using `hey` to test 10,000 requests across 100 concurrent clients:

``````
❯ hey -n 10000 -c 100 http://localhost:3000/

Summary:
  Total:    0.2746 secs
  Slowest:  0.0167 secs
  Fastest:  0.0002 secs
  Average:  0.0026 secs
  Requests/sec: 36418.8132

  Total data:   100000 bytes
  Size/request: 10 bytes
``````

Node + Fastify

``````
❯ npm run start
``````

Using `hey` again:

``````
❯ hey -n 10000 -c 100 http://localhost:3000/

Summary:
  Total:    0.6606 secs
  Slowest:  0.0483 secs
  Fastest:  0.0001 secs
  Average:  0.0065 secs
  Requests/sec: 15138.5719

  Total data:   100000 bytes
  Size/request: 10 bytes
``````
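For what it's worth, those two hey runs work out to roughly 2.4x the throughput for Bun on this trivial endpoint:

```javascript
// Requests/sec from the two hey runs above.
const bunRps = 36418.8132;
const fastifyRps = 15138.5719;
console.log(`${(bunRps / fastifyRps).toFixed(2)}x`); // 2.41x
```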

Serving an HTML file with Bun

``````
 Bun.serve({
   port: PORT,
   fetch(req) {
     const url = new URL(req.url);
     if (url.pathname === "/") return new Response(`Home page!`);
     if (url.pathname === "/json") return Response.json({ hello: "world" });
+    if (url.pathname === "/index.html")
+      return new Response(Bun.file("src/index.html"));
     return new Response(`404!`);
   },
 });
``````

Serves the `src/index.html` file just right:

``````
❯ xh --headers http://localhost:3000/index.html
HTTP/1.1 200 OK
Content-Length: 889
Content-Type: text/html;charset=utf-8
``````

Serving an HTML file with Node + Fastify

First, install the plugin:

`❯ npm i @fastify/static`

And make this change:

``````
+import path from "node:path";
+
 import fastify from "fastify";
+import fastifyStatic from "@fastify/static";

 const PORT = parseInt(process.env.PORT || "3000");

 const server = fastify();

+server.register(fastifyStatic, {
+  root: path.resolve("src"),
+});
+
 server.get("/", async () => {
   return "Home page!";
 });
 server.get("/json", (request, reply) => {
   reply.send({ hello: "world" });
 });

+server.get("/index.html", (request, reply) => {
+  return reply.sendFile("index.html");
+});
+
 server.listen({ port: PORT }, (err, address) => {
   if (err) {
     console.error(err);
``````

And it works great:

``````
❯ xh --headers http://localhost:3000/index.html
HTTP/1.1 200 OK
Accept-Ranges: bytes
Cache-Control: public, max-age=0
Connection: keep-alive
Content-Length: 889
Content-Type: text/html; charset=UTF-8
Date: Sat, 09 Sep 2023 03:04:15 GMT
Etag: W/"379-18a77e4e346"
Keep-Alive: timeout=72
Last-Modified: Sat, 09 Sep 2023 03:03:23 GMT
``````

Quick benchmark of serving the HTML file

Bun

``````
❯ hey -n 10000 -c 100 http://localhost:3000/index.html

Summary:
Total:    0.6408 secs
Slowest:  0.0160 secs
Fastest:  0.0001 secs
Average:  0.0063 secs
Requests/sec: 15605.9735

Total data:   8890000 bytes
Size/request: 889 bytes
``````

Node + Fastify

``````
❯ hey -n 10000 -c 100 http://localhost:3000/index.html

Summary:
Total:    1.5473 secs
Slowest:  0.0272 secs
Fastest:  0.0078 secs
Average:  0.0154 secs
Requests/sec: 6462.9597

Total data:   8890000 bytes
Size/request: 889 bytes
``````

Again, a 2x performance win for Bun.

Conclusion

There isn't much to conclude here. Just an intro to the beauty of how quick Bun is, both in terms of developer experience and raw performance.
What I admire about Bun being such a convenient bundle is that Python'esque feeling of simplicity and minimalism. (For example `python3.11 -m http.server -d src 3000` will make `http://localhost:3000/index.html` work)

The basic boilerplate of Node with Fastify + TypeScript + `nodemon` + `ts-node` is a great one if you're not ready to make the leap to Bun. I would certainly use it again. Fastify might not be the fastest server in the Node ecosystem, but it's good enough.

What's not shown in this little intro blog post, and is perhaps a silly thing to focus on, is the speed with which you type `bun --hot src/index.ts` and the server is ready to go. As far as human perception goes, it's instant. `npm run dev`, on the other hand, has this ~3 second "lag". Not everyone cares about that, but I do. It's more of an ethos: that wonderful feeling that you don't pause your thinking.

It's hard to see exactly when I press the Enter key, but compare that to Bun.

UPDATE (Sep 11, 2023)

I found this: github.com/SaltyAom/bun-http-framework-benchmark
It's a much better benchmark than mine here. Mind you, as long as you're not using something horribly slow, and you're not doing any I/O, the HTTP framework's performance doesn't matter much.
