Why use Node.js? When can I use Node.js?
JavaScript's rapid evolution has brought many changes, and the face of web development today is completely different. Just a few years ago, running JavaScript on a server would have been unimaginable.
Before delving into Node.js, you may want to understand the benefits of using JavaScript across the stack: it unifies the language and the data format (JSON), letting you reuse developer resources in the best possible way. Incorporating Node.js into the technology stack is a key advantage of this approach.
Node.js is a JavaScript runtime environment built on Chrome's V8 JavaScript engine. It is worth noting that Ryan Dahl, the creator of Node.js, was "inspired by applications such as Gmail": his goal was to build websites with real-time push capabilities. To that end, Node.js provides a facility for non-blocking, event-driven I/O.
To sum it up in one sentence: Node.js shines in real-time web applications that use websocket push technology. After more than 20 years of stateless web applications built on the stateless request-response paradigm, we finally have web applications with real-time, two-way connections, where both the client and the server can initiate communication and exchange data freely.
This is in stark contrast to the typical web response paradigm, in which communication is always initiated by the client. It is also all based on the open web technology stack (HTML, CSS and JS), running over the standard port 80.
One could argue that we've been doing this for years in the form of Flash and Java Applets - but really, those were just sandboxed environments that used the web as a transport protocol to deliver data to the client. Additionally, they ran in isolation, often over non-standard ports, which could require extra permissions.
With these strengths, Node.js plays a key role in the technology stack of many high-profile companies that depend on its unique advantages. The Node.js Foundation has consolidated just about all of the best arguments: a short presentation on why enterprises should consider Node.js can be found on the Node.js Foundation's case studies page. In this article, I will discuss not only how these advantages are achieved, but also why you might want to use Node.js, using some classic web application models as examples.
How does it work?
That definition of Node.js, a JavaScript runtime built around non-blocking, event-driven I/O, is a bit of a mouthful.
What it means is that Node.js is not a new, solve-it-all platform that is about to dominate the web development world. Instead, it is a platform that fills a particular need, and understanding this is absolutely essential. You definitely do not want to use Node.js for CPU-intensive operations; in fact, using it for heavy computation will annul nearly all of its advantages. Where Node.js really shines is in building fast, scalable network applications, because it can handle a huge number of concurrent connections with high throughput, which equates to high scalability.
How it works under the hood is quite interesting. Compared with traditional web-serving techniques, where each connection (request) spawns a new thread, taking up system RAM and eventually maxing out at the amount of RAM available, Node.js operates on a single thread, using non-blocking I/O calls, which allows it to support tens of thousands of concurrent connections held in the event loop.
Quick calculation: assuming each thread potentially takes 2 MB of memory, running on a system with 8 GB of RAM puts us at a theoretical maximum of about 4,000 concurrent connections (8 GB ÷ 2 MB per thread ≈ 4,000), and that is before counting the cost of context switching between threads. (The calculation comes from Michael Abernethy's article "Just what is Node.js?", published on IBM developerWorks in 2011; unfortunately, the link to that article is now dead.) That is the scenario you typically deal with in traditional web-serving techniques. By avoiding all of that, Node.js achieves scalability levels of over 1M concurrent connections and over 600k concurrent websocket connections.
Of course, sharing a single thread between all client requests is a potential pitfall of writing Node.js applications. First, heavy computation can clog Node's single thread and cause problems for all clients (more on this later), because incoming requests are blocked until the computation completes. Second, developers need to be very careful not to let an exception bubble up to the core (topmost) Node.js event loop, which causes the Node.js instance to terminate (effectively crashing the program).
To avoid exceptions bubbling up to the top, a common technique is to pass errors back to the caller as callback parameters (instead of throwing them, as you would in other environments). Even if some unhandled exception does manage to bubble up, there are tools to monitor the Node.js process and perform the necessary recovery from a crash (although you probably won't be able to recover the current state of the user session), the most common being the Forever module.
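To make the error-passing convention concrete, here is a minimal sketch of an error-first callback; the readConfig helper and the config file path are hypothetical, not something from the original article.

```javascript
const fs = require('fs');

// Hypothetical helper: errors are handed back as the first callback argument
// instead of being thrown, so they never bubble up to the top-level event loop.
function readConfig(path, callback) {
  fs.readFile(path, 'utf8', (err, data) => {
    if (err) {
      return callback(err); // pass the error to the caller
    }
    let config;
    try {
      config = JSON.parse(data);
    } catch (parseErr) {
      return callback(parseErr); // even parse errors travel through the callback
    }
    callback(null, config); // success: the error slot is null
  });
}

readConfig('./config.json', (err, config) => {
  if (err) {
    console.error('Could not load config:', err.message);
    return;
  }
  console.log('Loaded config:', config);
});
```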
One thing that definitely should not be overlooked when discussing Node.js is its built-in support for package management via the npm tool, which ships by default with every Node.js installation. The idea of npm modules is quite similar to Ruby Gems: a set of reusable components, easily installed from an online repository, with version and dependency management.
A full list of published modules can be found on the npm website, or accessed via the npm CLI tool that is installed automatically with Node.js. The module ecosystem is open to all: anyone can publish their own module, and it will then be listed in the npm repository. For an introduction to npm, see the Beginner's Guide, and for details on publishing modules, the npm Publishing Tutorial.
Some useful npm modules appear throughout this article: Express.js (a web application framework), the Forever module (for restarting a crashed Node.js process), and Sequelize and Node ORM2 (relational data-access libraries discussed later). The list keeps growing; there are many useful packages out there for everyone to use.
Online chat is the quintessential real-time, multi-user application and the best showcase for Node.js: it is a lightweight, high-traffic, data-intensive (but low-processing/computation) application that runs across distributed devices. It is also a great case to study, because it is simple yet covers most of the paradigms you will use in a typical Node.js application.
Let's try to picture how it works.
Assume the simplest scenario: there is a single chat room on our website where people can exchange messages in a one-to-many (actually, to-everyone) fashion.
On the server side, we have a simple Express.js application that implements two things: 1) a GET request handler that serves the webpage containing the message board and a "Send" button for entering new messages, and 2) a websockets server that listens for new messages emitted by websocket clients.
On the client side, we have an HTML page with a couple of handlers set up: one for the "Send" button's click event, which picks up the input message and sends it down the websocket, and another that listens on the websocket client for new incoming messages (i.e., messages sent by other users, which the server now wants the client to display).
When one of the clients posts a message, here is what happens: 1) the click handler picks up the message and sends it through the websocket client to the server; 2) the websocket component on the server receives the message and broadcasts it to all connected clients; 3) each client's incoming-message handler catches the message and appends it to the message board.
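As a rough illustration, here is a minimal sketch of the server side of this setup. The article only says "websockets", so the Socket.IO library and the chat.html page are assumptions made for the example.

```javascript
const express = require('express');
const http = require('http');
const { Server } = require('socket.io'); // assumed websocket library

const app = express();
const server = http.createServer(app);
const io = new Server(server);

// 1) GET handler serving the page with the message board and "Send" button
app.get('/', (req, res) => {
  res.sendFile(__dirname + '/chat.html'); // hypothetical static page
});

// 2) websocket server listening for new messages from clients
io.on('connection', (socket) => {
  socket.on('chat message', (msg) => {
    io.emit('chat message', msg); // broadcast to every connected client
  });
});

server.listen(3000, () => console.log('Chat server listening on port 3000'));
```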
This is the simplest example. For a more robust solution, you might use a simple cache based on something like Redis. In an even more advanced solution, a message queue could act as a routing layer for messages, and a more robust delivery mechanism could be implemented that, for example, stores messages while a connection is lost or a client is offline. But regardless of the improvements you make, Node.js will still operate on the same basic principles: reacting to events, handling many concurrent connections, and keeping the user experience fluid.
While Node.js really shines in real-time applications, it is also a natural fit for exposing data from object databases such as MongoDB. Because the data is stored in JSON, Node.js can work with objects in exactly the form they are stored, with no impedance mismatch and no data conversion.
For example, if you are using Rails, you would convert from JSON into binary models, and then expose them back as JSON over HTTP when the data is consumed by React.js or Angular.js, or even in plain jQuery AJAX calls. With Node.js, you can simply expose your JSON objects through a REST API for the client to consume. Additionally, you do not need to worry about converting between JSON and anything else when reading from or writing to the database (if you are using MongoDB). In short, using a uniform data-serialization format across the client, server and database spares you multiple conversions.
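Here is a hedged sketch of what that can look like with Express and the official MongoDB driver; the connection string, database name and articles collection are placeholders for illustration, not details from the article.

```javascript
const express = require('express');
const { MongoClient } = require('mongodb');

const app = express();
const client = new MongoClient('mongodb://localhost:27017'); // placeholder URI

app.get('/api/articles', async (req, res) => {
  try {
    // Documents are stored as JSON-like BSON and go out as JSON, unchanged.
    const articles = await client
      .db('blog')             // hypothetical database
      .collection('articles') // hypothetical collection
      .find({})
      .limit(20)
      .toArray();
    res.json(articles);
  } catch (err) {
    res.status(500).json({ error: err.message });
  }
});

client.connect().then(() => app.listen(3000));
```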
If you are receiving a large amount of concurrent data, your database can become a bottleneck. As depicted above, Node.js can easily handle the concurrent connections themselves. But because database access is a blocking operation (in this case), we run into trouble. The solution is to acknowledge the client's behavior before the data is actually written to the database.
With that approach, the system maintains its responsiveness under heavy load, which is particularly useful when the client does not need firm confirmation that the write succeeded. Typical examples include logging or writing user-tracking data, processed in batches; and operations that do not need to be reflected instantly (such as updating a "Likes" count on Facebook), where eventual consistency, so often used in the NoSQL world, is acceptable.
Data gets queued through some kind of caching or message-queuing infrastructure (e.g., RabbitMQ or ZeroMQ) and digested by a separate database batch-write process, or by computation-intensive backend services written on a platform better suited to such tasks. Similar behavior can be implemented in other languages and frameworks, but not on the same hardware with the same high, sustained throughput.
In short: with Node, you can push the database writes off to one side and deal with them later, proceeding as if they had already succeeded.
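A minimal sketch of this "acknowledge first, persist later" pattern, assuming RabbitMQ via the amqplib client (RabbitMQ is one of the queues named above; the queue name, route and payload are illustrative).

```javascript
const express = require('express');
const amqp = require('amqplib');

async function main() {
  const conn = await amqp.connect('amqp://localhost'); // placeholder broker
  const channel = await conn.createChannel();
  await channel.assertQueue('tracking-events', { durable: true });

  const app = express();
  app.use(express.json());

  app.post('/track', (req, res) => {
    // Hand the event to the queue and acknowledge the client right away;
    // a separate worker process batches these into the database later.
    channel.sendToQueue('tracking-events', Buffer.from(JSON.stringify(req.body)));
    res.status(202).json({ queued: true });
  });

  app.listen(3000);
}

main().catch(console.error);
```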
In more traditional web platforms, HTTP requests and responses are treated as isolated events, when in fact they are streams. This observation can be used in Node.js to build some cool features. For example, it is possible to process files while they are still being uploaded: since the data arrives as a stream, we can process it in real time. This could be used for real-time audio or video encoding, or for proxying between different data sources (see the next section).
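As a sketch of the idea, the handler below consumes an upload chunk by chunk while it is still in flight, using the fact that the request object is a readable stream; hashing the bytes stands in for whatever real-time processing (e.g., encoding) you would actually do.

```javascript
const http = require('http');
const crypto = require('crypto');

http.createServer((req, res) => {
  if (req.method === 'POST' && req.url === '/upload') {
    const hash = crypto.createHash('sha256');
    let received = 0;

    // Each chunk is processed as soon as it arrives, before the upload finishes.
    req.on('data', (chunk) => {
      received += chunk.length;
      hash.update(chunk);
    });

    req.on('end', () => {
      res.end(JSON.stringify({ bytes: received, sha256: hash.digest('hex') }));
    });
  } else {
    res.statusCode = 404;
    res.end();
  }
}).listen(3000);
```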
Node.js is easily employed as a server-side proxy, where it can handle a large number of simultaneous connections in a non-blocking manner. This is especially useful for proxying different services with different response times, or for collecting data from multiple sources.
Consider, for example, a server-side application that communicates with third-party resources, pulls in data from different sources, or stores assets such as images and videos on third-party cloud services.
Although dedicated proxy servers do exist, Node may be helpful if your proxying infrastructure does not exist yet, or if you need a solution for local development.
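A rough sketch of such an aggregating proxy is shown below; the upstream service URLs are placeholders, and the global fetch() call assumes Node 18 or newer (on older versions you would use an HTTP client module instead).

```javascript
const express = require('express');
const app = express();

app.get('/dashboard-data', async (req, res) => {
  try {
    // Both upstream calls run concurrently; neither blocks the event loop.
    const [orders, inventory] = await Promise.all([
      fetch('http://orders.internal/api/summary').then((r) => r.json()),
      fetch('http://inventory.internal/api/summary').then((r) => r.json()),
    ]);
    res.json({ orders, inventory }); // one combined response for the client
  } catch (err) {
    res.status(502).json({ error: 'upstream failure', detail: err.message });
  }
});

app.listen(3000);
```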
Let's get back to the application level. Another example where desktop software can easily be replaced by a real-time web solution is brokers' trading software, used to track stock prices, perform calculations and technical analysis, and draw charts.
Switching to a real-time, web-based solution would let brokers easily switch workstations or workplaces. Soon, we may start seeing them on Florida beaches...
Another common use case where Node-with-websockets fits perfectly: tracking website visitors and visualizing their interactions in real time. You can gather real-time statistics from your users, and even open communication channels for targeted interactions with visitors at specific points in the funnel. One example of this is CANDDi.
Imagine how you could improve your business if you knew what your visitors were doing in real time. With the real-time, two-way sockets of Node.js, now you can.
Now let's look at the infrastructure side of things. Imagine, for example, a SaaS provider that wants to offer its users a service-monitoring page (such as the GitHub status page). With the Node.js event loop, we can create a powerful web-based dashboard that checks the status of the services asynchronously and pushes the data to clients over websockets. Both internal (intra-company) and public service statuses can be reported live in real time with this technology.
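A hedged sketch of such a dashboard backend follows: the service URLs and the 5-second polling interval are placeholders, Socket.IO is assumed for the websocket push, and the fetch() health checks assume Node 18 or newer. Serving the dashboard page itself is left out.

```javascript
const http = require('http');
const { Server } = require('socket.io'); // assumed websocket library

const server = http.createServer();
const io = new Server(server);

const services = ['http://api.internal/health', 'http://jobs.internal/health'];

async function checkAll() {
  // All health checks run concurrently and never block the event loop.
  const results = await Promise.all(
    services.map(async (url) => {
      try {
        const res = await fetch(url);
        return { url, up: res.ok };
      } catch {
        return { url, up: false };
      }
    })
  );
  io.emit('status', results); // push the fresh status to every connected dashboard
}

setInterval(checkAll, 5000); // re-check every 5 seconds
server.listen(3000);
```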
Note: Do not try to build hard real-time systems (i.e., systems requiring consistent response times) in Node.js. For that kind of application, Erlang may be a better choice.
Node.js with Express.js can also create classic Web applications on the server side. There are supporters and opponents of this approach. Here are some issues to consider:
Advantages:
Disadvantages:
An alternative to CPU-intensive computation is to create a highly scalable, MQ-backed environment with backend processing, keeping Node as a front-facing "clerk" that handles client requests asynchronously.
For example, comparing Node.js with Express.js against Ruby on Rails, the latter is clearly the better fit when it comes to relational data access.
Compared with its competitors, Node.js's relational database tooling is still fairly primitive. Rails, on the other hand, delivers data-access setup and database schema migration support out of the box, on top of other Gems. Rails and its peer frameworks have mature, proven Active Record or Data Mapper data-access layer implementations; if you want to try to replicate those features in pure JavaScript, good luck to you.
However, if you really prefer to implement everything in JS, check out Sequelize and Node ORM2.
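To give a feel for what that looks like, here is a tiny Sequelize sketch; the in-memory SQLite database (which requires the sqlite3 package) and the User model are purely illustrative.

```javascript
const { Sequelize, DataTypes } = require('sequelize');

const sequelize = new Sequelize('sqlite::memory:'); // needs the sqlite3 package

// Hypothetical model, for illustration only
const User = sequelize.define('User', {
  name: { type: DataTypes.STRING, allowNull: false },
  email: { type: DataTypes.STRING, unique: true },
});

async function demo() {
  await sequelize.sync(); // create the tables
  await User.create({ name: 'Ada', email: 'ada@example.com' });
  const users = await User.findAll();
  console.log(users.map((u) => u.name));
}

demo().catch(console.error);
```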
If you are just using Node.js as a public-facing interface while using a Rails backend to access a relational database, this is fine and not uncommon.
Node.js is not the platform of choice when it comes to heavy computation. You definitely do not want to build a Fibonacci computation server in Node.js. In general, any CPU-intensive operation annuls the throughput benefits Node offers through its event-driven, non-blocking I/O model, because incoming requests will be blocked while the single thread is occupied with number crunching.
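A short illustration of the problem, using a deliberately naive Fibonacci handler (the endpoint and the default value of n are made up): while fib(n) is computing, the single thread is busy and every other request has to wait.

```javascript
const http = require('http');

// Deliberately naive, exponential-time Fibonacci
function fib(n) {
  return n < 2 ? n : fib(n - 1) + fib(n - 2);
}

http.createServer((req, res) => {
  const url = new URL(req.url, 'http://localhost');
  const n = Number(url.searchParams.get('n') || 40);
  // This call blocks the event loop for large n, stalling all other clients.
  res.end(String(fib(n)));
}).listen(3000);
```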
As mentioned earlier, Node.js is single-threaded and uses only a single CPU core. When it comes to adding concurrency on a multi-core server, the Node core team has done some work in the form of the cluster module. You can also easily run several Node.js server instances behind an nginx reverse proxy.
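For illustration, a minimal cluster sketch is below: the primary process forks one worker per CPU core, and each worker runs its own event loop (cluster.isPrimary requires Node 16+; older versions use cluster.isMaster).

```javascript
const cluster = require('cluster');
const http = require('http');
const os = require('os');

if (cluster.isPrimary) {
  // Fork one worker per CPU core
  for (let i = 0; i < os.cpus().length; i++) {
    cluster.fork();
  }
  cluster.on('exit', (worker) => {
    console.log(`worker ${worker.process.pid} died, starting a new one`);
    cluster.fork();
  });
} else {
  // Each worker shares the same port; incoming connections are distributed
  http.createServer((req, res) => {
    res.end(`handled by worker ${process.pid}\n`);
  }).listen(3000);
}
```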
If using a cluster, you should still put all the heavy calculations into background processes written in a more suitable environment, and have them communicate through a message queue server like RabbitMQ.
This approach can achieve very high scalability, even though all of your background processing may initially run on the same server. Those background processing services can later be distributed out to separate worker servers without any need to reconfigure the load on the front-facing web servers.
Of course, you can use the same approach on other platforms too, but with Node.js you get the high reqs/sec throughput we have talked about, because each request is a small task handled very quickly and efficiently.
We have discussed Node.js from theory to practice, beginning with its goals and ambitions and ending with its sweet spots and pitfalls. When people run into problems with Node, it almost always boils down to the fact that blocking operations are the root of all evil, and 99% of Node misuses are a direct consequence of that. Remember: never use Node.js to solve compute-scaling problems. It was designed to solve I/O-scaling problems, and it does that really well. So, if your application does not contain CPU-intensive operations and does not access any blocking resources, you can exploit the benefits of Node.js and enjoy fast, scalable web applications.
Original English article: https://medium.com/the-node-js-collection/why-the-hell-would-you-use-node-js-4b053b94ab8e