The easiest way to write Fastly Compute services in Zig.
Fastly Compute is Fastly's service to run custom code directly on CDN nodes.
The service runs anything that can be compiled to WebAssembly, and exports a convenient set of functions to interact with the platform.
Zigly is a library that makes it easy to write Fastly Compute modules in Zig.
Beyond the functions exported by the Fastly platform, Zigly will eventually include additional utility functions (cookie manipulation, JWT tokens, tracing...) to make application development as simple as possible.
Zigly is written for Zig 0.12.x.
Check out the example directory.
This contains an example Fastly application that relays all incoming traffic to a backend server, with transparent caching.
If you just want to use Fastly as a CDN, this is all you need!
Add the dependency to your project:
zig fetch --save=zigly https://github.com/jedisct1/zigly/archive/refs/tags/0.1.8.tar.gz
And add the following to your build.zig file:
const zigly = b.dependency("zigly", .{
.target = target,
.optimize = optimize,
});
exe.root_module.addImport("zigly", zigly.module("zigly"));
exe.linkLibrary(zigly.artifact("zigly"));
The zigly structure can be imported in your application with:
const zigly = @import("zigly");
const std = @import("std");
pub fn main() !void {
std.debug.print("Hello from WebAssembly and Zig!\n", .{});
}
The program can be compiled with (replace example.zig with the source file name):
zig build-exe -target wasm32-wasi example.zig
Happy with the result? Add -O ReleaseSmall or -O ReleaseFast to get a very small or very fast module:
zig build-exe -target wasm32-wasi -O ReleaseSmall example.zig
The example above should compile to no more than 411 bytes.
If you are using a build file instead, define the target as wasm32-wasi in the build.zig file:
const target = b.standardTargetOptions(.{ .default_target = .{ .cpu_arch = .wasm32, .os_tag = .wasi } });
...and build with zig build -Doptimize=ReleaseSmall or -Doptimize=ReleaseFast to get optimized modules.
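Putting these pieces together, a minimal build.zig for Zig 0.12.x might look like the following sketch (the executable name and source path are placeholders, not requirements):

const std = @import("std");

pub fn build(b: *std.Build) void {
    // Fastly Compute runs WebAssembly/WASI modules, so default to that target.
    const target = b.standardTargetOptions(.{ .default_target = .{ .cpu_arch = .wasm32, .os_tag = .wasi } });
    const optimize = b.standardOptimizeOption(.{});

    const exe = b.addExecutable(.{
        .name = "main",
        .root_source_file = .{ .path = "src/main.zig" },
        .target = target,
        .optimize = optimize,
    });

    // Wire up the zigly dependency declared with `zig fetch --save=zigly`.
    const zigly = b.dependency("zigly", .{ .target = target, .optimize = optimize });
    exe.root_module.addImport("zigly", zigly.module("zigly"));
    exe.linkLibrary(zigly.artifact("zigly"));

    b.installArtifact(exe);
}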
The easiest way to test the resulting modules is to use Viceroy, a reimplementation of the Fastly API that runs locally.
const downstream = try zigly.downstream();
var response = downstream.response;
try response.body.writeAll("Hello world!");
try response.finish();
downstream() returns a type representing the initial connection, from a client to the proxy.
That type includes response, which can be used to send a response, as well as request, which can be used to inspect the incoming request.
Every function call may fail with an error from the FastlyError set.
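Errors can also be handled explicitly instead of being propagated with try. A minimal sketch, reusing the hello-world response above:

const std = @import("std");
const zigly = @import("zigly");

pub fn main() !void {
    const downstream = try zigly.downstream();
    var response = downstream.response;
    // Any call into the Fastly API may fail with a FastlyError; report it and bail out.
    response.body.writeAll("Hello world!") catch |err| {
        std.debug.print("unable to write the response body: {}\n", .{err});
        return err;
    };
    try response.finish();
}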
A slightly more complicated example:
const downstream = try zigly.downstream();
var response = downstream.response;
try response.setStatus(201);
try response.headers.set("X-Example", "Header");
try response.body.writeAll("Partial");
try response.flush();
try response.body.writeAll("Response");
try response.finish();
var logger = try zigly.Logger.open("logging_endpoint");
try logger.write("Operation successful!");
Note that calling finish() is always required in order to actually send a response to the client.
But realistically, most responses will either be simple redirects:
var downstream = try zigly.downstream();
try downstream.redirect(302, "https://www.perdu.com");
or responding directly from the cache, proxying to the origin if the cached entry is nonexistent or expired:
var downstream = try zigly.downstream();
try downstream.proxy("google", "www.google.com");
Applications can read the body of an incoming request as well as other information such as the headers:
const request = downstream.request;
const user_agent = try request.headers.get(allocator, "user-agent");
if (request.isPost()) {
// method is POST, read the body until the end, up to 1000000 bytes
const body = try request.body.readAll(allocator, 1000000);
}
As usual in Zig, memory allocations are never hidden, and applications can choose the allocator they want to use for individual function calls.
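For example, the allocator in the snippet above could be a request-scoped arena that is freed in one shot once the request has been handled. This is just one possible setup, not something Zigly requires:

const std = @import("std");
const zigly = @import("zigly");

pub fn main() !void {
    // Everything allocated from the arena is freed at once when the handler returns.
    var arena = std.heap.ArenaAllocator.init(std.heap.page_allocator);
    defer arena.deinit();
    const allocator = arena.allocator();

    const downstream = try zigly.downstream();
    const request = downstream.request;
    const user_agent = try request.headers.get(allocator, "user-agent");
    _ = user_agent;
}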
Making HTTP queries is easy:
var query = try zigly.Request.new("GET", "https://example.com");
var response = try query.send("backend");
const body = try response.body.readAll(allocator, 0);
Arbitrary headers can be added to the outgoing query:
try query.headers.set("X-Custom-Header", "Custom value");
Body content can also be pushed, even as chunks:
try query.body.write("X");
try query.body.write("Y");
try query.body.close();
And the resulting response contains headers and body properties that can be inspected the same way as a downstream query.
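For example, a header of the upstream response can be read with the same headers.get() call used earlier for the downstream request:

const content_type = try response.headers.get(allocator, "content-type");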
Caching can be disabled or configured on a per-query basis with setCachingPolicy():
try query.setCachingPolicy(.{ .serve_stale = 600, .pci = true });
Attributes include:
- no_cache
- ttl
- serve_stale
- pci
- surrogate_key
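For example, assuming no_cache is a boolean like pci above, caching can be turned off entirely for a single query:

try query.setCachingPolicy(.{ .no_cache = true });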
With pipe(), the response sent to a client can be a direct copy of another response. The application will then act as a proxy, optionally also copying the original status and headers.
var query = try zigly.Request.new("GET", "https://google.com");
var upstream_response = try query.send("google");
const downstream = try zigly.downstream();
try downstream.response.pipe(&upstream_response, true, true);
Proxying is even easier to use than pipes when a query should be sent unmodified (with the exception of the Host header) to the origin:
var downstream = try zigly.downstream();
try downstream.proxy("google", "www.google.com");
The second parameter is optional. If null, the original Host header will not be modified.
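For example, to proxy to the google backend while keeping the original Host header untouched:

try downstream.proxy("google", null);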
Redirecting the client to another address can be done with a single function call on the downstream object:
const downstream = try zigly.downstream();
try downstream.redirect(302, "https://www.perdu.com");
By default, responses are left as-is, which means that if compression (Content-Encoding) was accepted by the client, the response can be compressed.
Calling setAutoDecompressResponse(true) on a Request object configures the Fastly Compute runtime to decompress gzip-encoded responses before streaming them to the application.
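A minimal sketch, assuming the call returns an error union like the other Request methods:

var query = try zigly.Request.new("GET", "https://example.com");
try query.setAutoDecompressResponse(true);
var response = try query.send("backend");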
Dictionary lookups work the same way, by opening a dictionary by name and reading individual keys:
const dict = try zigly.Dictionary.open("name");
const value = try dict.get(allocator, "key");
Log entries can be written to any logging endpoint configured for the service:
const logger = try zigly.Logger.open("endpoint");
try logger.write("Log entry");
The fastly command-line tool only supports compilation of Rust and AssemblyScript at the moment.
However, it can still be used to upload pre-compiled code written in other languages, including Zig.
- Create a new project:
fastly compute init
For the language, select Other (pre-compiled WASM binary).
- Add a build script:
Add the following lines to the fastly.toml file:
[scripts]
build = "zig build -Doptimize=ReleaseSmall -Dtarget=wasm32-wasi && mkdir -p bin && cp zig-out/bin/*.wasm bin/main.wasm"
- Compile and package the Fastly Compute module:
fastly compute build
- Test locally
fastly compute serve
- Deploy!
fastly compute deploy
In order to deploy new versions, repeat steps 3 and 5.