Conversation

@posborne
Collaborator

  • Add spidermonkey-json benchmark using JSON.parse/stringify
  • Add SpiderMonkey Regex benchmark
  • Scaled all spidermonkey benchmarks to take roughly 100M cycles per iteration on my test machine, which is hopefully a balanced size.
  • Rename base spidermonkey benchmark to markdown

--

The new benchmarks are pretty straightforward; the trickier bit here was figuring out an approach I was reasonably happy with for bundling JS inputs into the artifact. In the end, I used the same hex-encoded-contents-in-header-files approach, but with a top-level header pulling in the JS code bits.
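
For reference, the generated headers end up looking roughly like the sketch below (the file and array names here are illustrative, not necessarily what the build actually generates):

```c
/* json/main.js.h -- hypothetical example of a generated per-file header;
 * the JS source is embedded as hex-encoded bytes so it can be compiled
 * directly into the benchmark artifact. */
static const unsigned char js_json_main_js[] = {
    0x2f, 0x2f, 0x20, 0x6d, 0x61, 0x69, 0x6e, 0x2e, 0x6a, 0x73, 0x0a, /* "// main.js\n" */
    /* ... remaining bytes of main.js ... */
};
static const unsigned int js_json_main_js_len = sizeof(js_json_main_js);

/* js_code.h -- hypothetical top-level header that pulls in each
 * benchmark's JS blob so the C side only needs a single #include. */
#include "json/main.js.h"
#include "regex/main.js.h"
#include "markdown/main.js.h"
```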

Adding new JS benchmarks should be fairly simple now: just add a new directory under benchmarks/spidermonkey/js/ containing main.js and any other files, and provide the expected named input/stdout/stderr files following the existing pattern, as sketched below.
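
Concretely, a new benchmark directory would look something like this (the non-main.js file names are placeholders; use whatever names the existing spidermonkey benchmarks follow):

```
benchmarks/spidermonkey/js/
└── my-new-benchmark/        # hypothetical benchmark name
    ├── main.js              # entry point executed by the benchmark
    ├── <input file(s)>      # named input(s), per the existing pattern
    ├── <expected stdout>    # expected stdout for verification
    └── <expected stderr>    # expected stderr for verification
```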

I did consider switching things over to StarlingMonkey, but decided to leave that alone for now: it didn't appear it would be a straightforward migration, and it probably doesn't have much value in practice for the insights we're looking to get from sightglass.

CC @zkat / @TartanLlama

@posborne requested a review from fitzgen on December 23, 2025 at 18:02.