Use selective-await to reduce 99% of JavaScript async call overhead (github.com/etherdream)
33 points by etherdream on June 24, 2022 | 6 comments


I may sound like a dinosaur, but

  const id   = reader.u32() ?? await A

awaiting a magical A that comes from an import and that was never touched before looks very weird.


It's not just you - I regularly read a fair amount of Type/JavaScript, and that is a weird library API.

A resolves to a singleton Promise instance that the QuickReader class keeps updating. That means multiple QuickReader instances overwrite each other's promise: in effect, A only works with the last QuickReader instance created. I'm not sure if that's the intended behavior, but I would have designed it differently.


This is not a problem: `await A` is executed immediately after each read, so there is no conflict.


Cool but shouldn't this be fixed at the runtime level?


Not easily. The issue is that the specification mandates that promise resolve callbacks be queued as Jobs and cannot be called synchronously.

This not only limits performance, as described in the article; it can also lead to subtle bugs due to reordering of callbacks.
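To make the constraint concrete: even a promise that is already resolved never runs its callback synchronously; the spec queues it as a Job (microtask) that only fires after the current synchronous code finishes. A minimal sketch:

```javascript
// Even an already-resolved promise defers its .then callback to the
// microtask queue; it is never invoked synchronously.
const order = [];
Promise.resolve().then(() => order.push("callback"));
order.push("sync");
console.log(order); // ["sync"] -- the callback has not run yet
```

This queue round-trip on every `await` is the overhead the article is measuring.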


Is there some other way to do that besides process.nextTick() and queueMicrotask()?

https://nodejs.org/api/process.html#processnexttickcallback-...
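For reference, the two differ in ordering under Node.js: the nextTick queue drains before the promise microtask queue, and both run before timers. A small sketch (not specific to this library):

```javascript
// Node.js ordering: nextTick callbacks drain before promise
// microtasks, and both run before timer callbacks.
const seen = [];
Promise.resolve().then(() => seen.push("microtask"));
process.nextTick(() => seen.push("nextTick"));
setTimeout(() => console.log(seen), 0); // ["nextTick", "microtask"]
```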

> It is very important for APIs to be either 100% synchronous or 100% asynchronous.

As I understand it, this implementation sidesteps the hazard by providing separate but internally coupled sync/async APIs, kept consistent by convention around the critical internal and external conditionals. Is that correct? Is it just this sync/async code-path convention that makes the implementation safe, as opposed to a single function that directly returns either a value or a Promise?

The convention does make it kludgy to use, but the native alternatives would either defeat the purpose or become the very hazard mentioned in the documentation.

Have you considered creating a Babel macro for this?
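To make that sync/async split concrete, here is a hypothetical sketch of the pattern (SketchReader, feed, and readId are illustrative names, not the library's actual internals): the synchronous read returns undefined on a buffer miss, and only then does the caller fall back to awaiting a shared promise, so the API surface stays 100% async while the fast path never suspends.

```javascript
// Hypothetical sketch -- NOT the library's real implementation.
class SketchReader {
  #buf = [];
  #wake = null;
  // Shared promise the caller awaits on a read miss; replaced each miss.
  promise = Promise.resolve();

  // Sync fast path: returns a buffered value, or undefined on a miss.
  u32() {
    if (this.#buf.length > 0) return this.#buf.shift();
    this.promise = new Promise(resolve => { this.#wake = resolve; });
    return undefined;
  }

  // Supplies data and wakes any pending awaiter.
  feed(value) {
    this.#buf.push(value);
    this.#wake?.();
  }
}

// The caller is uniformly async (no value-or-Promise ambiguity), but
// it only actually suspends when the sync fast path misses.
async function readId(reader) {
  let v = reader.u32();
  if (v === undefined) {   // slow path: wait for data, then retry
    await reader.promise;
    v = reader.u32();
  }
  return v;
}
```

With data already buffered, readId resolves without ever touching the microtask queue for the read itself; only the miss path pays the `await` cost, which is where the claimed overhead reduction comes from.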



