
How a Netflix Interview question turned into my first NPM package

Linda Hamilton
Release: 2024-12-28 01:04:09


The problem with not understanding Promises

We've all been there. We have a large set of data where we need to make some sort of API request for each entry. Let's say it's an array of IDs for different venues: you need to look up the provider for each venue and return an array of providers. We build a new function to make these requests...

  const getProvidersFromVenueIDs = async (idArray) => {
    // Kick off a request for every id at once -- nothing limits
    // how many are in flight at the same time
    const requests = idArray.map(async (id) => {
      const res = await fetch(`https://venues_for_me.org/venue?id=${id}`);
      const venue = await res.json();
      return venue.provider;
    });
    return Promise.all(requests);
  };


Oops: you just DOSed the legacy server from eight years ago with all your requests...
A solution that I feel we've all been guilty of at some point is to set a timeout of a few milliseconds between batches of requests...

  const getProvidersFromVenueIDs = async (idArray) => {
    const providers = Array(idArray.length);
    const batchSize = 50;
    for (let i = 0; i < idArray.length; i += batchSize) {
      const batchToExecute = [];
      for (let j = i; j < Math.min(i + batchSize, idArray.length); j++) {
        batchToExecute.push(
          fetch(`https://venues_for_me.org/venue?id=${idArray[j]}`)
            .then((res) => res.json()),
        );
      }
      // Arbitrary pause so the next batch doesn't start "too soon"
      await new Promise((resolve) => setTimeout(resolve, 200));
      const venues = await Promise.all(batchToExecute);
      venues.forEach((venue, k) => {
        providers[i + k] = venue.provider;
      });
    }
    return providers;
  };


I want to take a shower after just writing this example... Not to mention the crazy amount of duplicated arrays (or the messy code); worse, this artificially limits your execution speed with an arbitrary timeout.

A good answer here is to create a concurrency limiter that only creates a promise when there is space under your max concurrency. Something similar to:

  const queue = [];
  const maxConcurrency = 5;
  let running = 0;

  // Only start a task when there is a free slot;
  // otherwise park it in a backlog until one opens up
  const runTask = (fn) => {
    if (running >= maxConcurrency) {
      queue.push(fn);
      return;
    }
    running++;
    fn().finally(() => {
      running--;
      if (queue.length > 0) {
        runTask(queue.shift());
      }
    });
  };

  // Given the idArray from before: wrap each request in a function
  idArray.forEach((id) => {
    runTask(() => fetch(`https://venues_for_me.org/venue?id=${id}`));
  });

As you can see, in order to not lose promises, you'll need to implement some sort of queue to keep a backlog of requests to make. In comes the title of this article.

Dunning-Kruger

I was watching a video from ThePrimeagen and a specific section caught my eye. One of his favourite questions to ask in a Netflix interview is for the interviewee to create an async queue with a max concurrency for executing promises.
This sounds exactly like the problem I had above!

This interview question had multiple layers: after the queue is implemented, add a retry on errors.
I spent an afternoon on this challenge and learned very quickly that I have skill issues. It turned out I didn't know promises as well as I thought I did.
After spending a few days deep-diving into promises, abort controllers, maps, sets, and their weak counterparts, I created Asyncrify.

With Asyncrify my goal was simple: create yet another async queue, but with no external dependencies and as light on resources as possible.
It needed to be able to add functions to the queue, set a max concurrency, set and handle timeouts, and enable or disable retries with exponential back-off.

It's a skill issue

So what were those skill issues I hear you not asking?

Learn your Promises. I cannot stress that enough.
One of the first problems I ran into is that I didn't understand how the execution of promises works. My first implementation looked something like this:

  async #runTasksRecursively() {
    await this.#runAsync();
    if (this.#queue.size === 0 && this.#retries.length === 0) {
      return;
    }
    this.#addToPromiseBlock();
  }

  async #runAsync() {
    if (!this.#runningBlock.every((item) => item === undefined)) {
      // Continues as soon as the FIRST promise in the block settles;
      // the results of the others are never collected
      await Promise.race(this.#runningBlock);
    }
  }

  #addToPromiseBlock() {
    const emptySpot = this.#getEmptySpot();
    if (this.#retries.length > 0 && !this.#lastRunWasError) {
      if (this.#errorsToInject.size > 0) {
        const task = this.#popInSet(this.#errorsToInject);
        if (this.#queue.size !== 0) {
          this.#lastRunWasError = true;
        }
        this.#assignPromiseToExecutionArray(task, emptySpot);
      }
    } else {
      const task = this.#popInSet(this.#queue);
      this.#lastRunWasError = false;
      this.#assignPromiseToExecutionArray(task, emptySpot);
    }
  }


I'm sure you saw the problem immediately: I'm using Promise.race to execute my "max concurrent" promises concurrently.
But this only continues after the first promise resolves; the rest are ignored. Then I add one more and execute them all again.
I had to go back to basics.
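Here's a minimal standalone sketch (not from the package) of the Promise.race behaviour that bit me:

const slow = new Promise((resolve) => setTimeout(() => resolve("slow"), 500));
const fast = new Promise((resolve) => setTimeout(() => resolve("fast"), 100));

// Promise.race settles as soon as the FIRST promise settles.
// `slow` keeps running in the background; its result is simply never observed.
Promise.race([slow, fast]).then((winner) => console.log(winner)); // "fast"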
The solution is to instead use .then and .catch, and run a function only when a spot opens up among the currently running tasks.

  add(fn, callback, errCallback) {
    if (this.#maxConcurrency !== 0 && this.#running >= this.#maxConcurrency) {
      this.#queue.add(fn);
    } else {
      this.#running++;
      fn()
        .then(callback)
        .catch(errCallback)
        .finally(() => {
          this.#running--;
          if (this.#queue.size > 0) {
            // Pull the oldest task out of the backlog and run it
            const nextPromise = this.#queue.values().next().value;
            this.#queue.delete(nextPromise);
            this.add(nextPromise, callback, errCallback);
          }
        });
    }
  }


Now we're keeping much better track of concurrent promises, and we also let the user handle errors and resolutions however they want to.
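For instance, a hypothetical call site (assuming a queue instance exposing the add method above) gets to decide what success and failure mean:

// Hypothetical usage: the caller supplies both callbacks
queue.add(
  () => fetch("https://venues_for_me.org/venue?id=42").then((res) => res.json()),
  (venue) => console.log("provider:", venue.provider),
  (err) => console.error("request failed:", err),
);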

Please use abort controllers. One of the big mistakes I often see is that people don't use abort controllers when a promise is no longer necessary after its initialisation. I did this too.
At first, in order to do timeouts, I used Promise.race:

  #promiseBuilder(fn) {
    const promise = new Array(this.#promiseTimeout > 0 ? 2 : 1);
    promise[0] = fn();

    if (this.#promiseTimeout > 0) {
      // Race the real task against a timer promise that rejects on timeout
      promise[1] = this.#timeoutHandler();
    }
    return promise;
  }

  #promiseRunner(fn, callback) {
    const promise = this.#promiseBuilder(fn);
    Promise.race(promise)
      .then((res) => {
        callback(res, null);
      })
      .catch((err) => {
        this.#errorHandler(err, fn, callback);
      })
      .finally(() => {
        this.#running--;
        this.#runPromiseFromQueue(callback);
      });
  }

As you can imagine, the promise is still executed after the timeout; it's just ignored. That looks a lot like my first mistake implementing the queue, doesn't it?
I did a bit of research into abort controllers, since my only experience with them had been in React.
AbortSignal.timeout!! This does exactly what I wanted!
And the only update to my code was one line:

  const promise = fn(
    this.#timeout > 0 ? AbortSignal.timeout(this.#timeout) : null,
  );

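If you haven't used it before, AbortSignal.timeout(ms) is a standard platform API that returns a signal which aborts automatically, and fetch accepts it directly. A minimal standalone sketch:

// The fetch rejects with a TimeoutError once the budget is exceeded,
// instead of running on unobserved in the background
fetch("https://venues_for_me.org/venue?id=42", {
  signal: AbortSignal.timeout(2000),
})
  .then((res) => res.json())
  .then(console.log)
  .catch((err) => console.error(err.name)); // "TimeoutError" after 2s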

Wow, that was so easy! But now the user of the package needs to create boilerplate to use the timeout feature. No need to fear! I did that for you!

export const abortHandler = (signal, reject) => {
  if (signal.aborted) {
    return reject(new Error("Aborted"));
  }
  const abortHandler = () => {
    reject(new Error("Aborted"));
    signal.removeEventListener("abort", abortHandler);
  };
  signal.addEventListener("abort", abortHandler);
};

Yet another micro NPM package

So how do you use Asyncrify?
Well, it's really easy. We first create our queue.

import Queue from 'Asyncrify'

const queue = new Queue()


The queue defaults to no timeout, no retries, and no max concurrency.
You can also provide a config object to the constructor (the option names below are illustrative; check the package README for the exact keys):

  // Illustrative option names: max concurrency, per-task timeout, retries
  const queue = new Queue({
    maxConcurrency: 3,
    timeout: 2000,
    maxRetries: 3,
  });

To add a promise to the queue, you must wrap it in a function that returns it.

// Wrap the promise in a function so the queue decides when it starts
const task = () => fetch("https://venues_for_me.org/venue?id=42");

Remember to add the abort handler to be able to use the timeout feature!
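Something like this (a hypothetical task, assuming the package exports the abortHandler shown above; per the one-line update earlier, the queue passes the signal from AbortSignal.timeout as the first argument, or null when no timeout is set):

import { abortHandler } from 'Asyncrify'

// Wires the queue's signal into both the fetch and a rejection path,
// so a timeout actually settles the promise instead of leaving it hanging
const getVenue = (signal) =>
  new Promise((resolve, reject) => {
    if (signal) abortHandler(signal, reject);
    fetch("https://venues_for_me.org/venue?id=42", { signal })
      .then((res) => res.json())
      .then(resolve)
      .catch(reject);
  });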

Then all you need to do is pass the function to the add method with your callback and error callback:

// Matches the documented signature: add(fn, callback, errCallback)
queue.add(
  getVenue,
  (venue) => console.log(venue.provider),
  (err) => console.error(err),
)

And that's it! Add as many as you want, as fast as you want, and it will only run three at a time until it's gotten through them all!
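Putting it all together, here's a sketch of the original venue problem solved with the queue (still using the illustrative config option from above):

import Queue from 'Asyncrify'

const queue = new Queue({ maxConcurrency: 3 }) // illustrative option name

const getProvidersFromVenueIDs = (idArray) =>
  new Promise((resolve) => {
    const providers = Array(idArray.length);
    let settled = 0;
    idArray.forEach((id, i) => {
      queue.add(
        () => fetch(`https://venues_for_me.org/venue?id=${id}`)
          .then((res) => res.json()),
        (venue) => {
          providers[i] = venue.provider;
          if (++settled === idArray.length) resolve(providers);
        },
        (err) => {
          console.error(`venue ${id} failed:`, err);
          if (++settled === idArray.length) resolve(providers);
        },
      );
    });
  });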

I've learned a lot while creating this package, including things I arguably should've known a long time ago. That's why I'm writing this article: I want you to see the arguably stupid mistakes I made and feel encouraged to make stupid mistakes of your own and learn from them, instead of feeling embarrassed and hiding away when they happen.

Go out there and write an article. Create a micro package with 10 weekly downloads from bots. You'll end up learning things you never knew you needed.

Source: dev.to