PAID2STREAM

Technology Stack

power / speed / safety / scalability

LiteSpeed reverse proxy

Go big or go home! Web servers need to handle big data, and when it comes down to it, LiteSpeed is the technology many large companies wish they had known about before they built their platforms. Its low-level, integrated object caching gives Paid2Stream massive performance right from the start. As developers who have spent years meticulously optimizing every single configuration file to squeeze performance out of a system, we are never going back!

HTTP/2 Test Results

$: h2load -n 10000 -c 10 -t 1 -T 5 -m 10 -H 'Accept-Encoding: gzip,deflate' https://localhost/example

Server      Requests/sec   MB/sec   Failures   Header Compression
LiteSpeed   55647          233.26   0          96.6%
Nginx       130.6          0.61     0          23.59%
Apache      109.1          0.48     0          78.54%
  • Please note: We usually run the h2load test with -c 100, but in this case there were too many errors occurring with nginx and Apache. We reduced -c to 10 for this reason.

  • Test shows LiteSpeed Web Server is 426X faster than Nginx and 510X faster than Apache.

Testing Environment

Network: 8.70 Gbits/sec traffic, 0.358 ms latency

Servers tested:

LiteSpeed Web Server v5.4.1
nginx v1.16.1
Apache v2.4.41

P2S-JBase:

Version: 3.9.11

LSCache: LiteSpeed Cache

nginx Cache: System Page Cache

Apache Cache: System Page Cache

Client Machine

Memory Size: 991.09MB

CPU number: 1 CPU

Model: Virtual CPU 6db7dc0e7704

Disk: NVMe SSD

Server Machine

Memory Size: 991.09MB

CPU number: 1 CPU

Model: Virtual CPU 6db7dc0e7704

Disk: NVMe SSD

Let's talk Redis!

Redis is a feature-rich, networked, in-memory data store. It helps prevent database overload and wins even more performance by caching DB queries in memory as well as handling user sessions. Its replication is very fast and non-blocking on the first synchronization, with automatic reconnection and partial resynchronization after a net split.
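To make the pattern concrete, here is a minimal sketch of read-through query caching from Rust with the redis crate. The key name, the 60-second TTL, and the load_profile_from_db helper are illustrative placeholders, not our production code.

// Minimal read-through cache sketch using the redis crate.
use redis::Commands;

// Try Redis first; fall back to the database on a miss and cache the result.
fn cached_profile(con: &mut redis::Connection, user_id: u64) -> redis::RedisResult<String> {
    let key = format!("profile:{user_id}");

    // GET yields None when the key is missing or has expired.
    let hit: Option<String> = con.get(&key)?;
    if let Some(value) = hit {
        return Ok(value);
    }

    // Cache miss: load from the database (stubbed below), then store the
    // result with a short TTL so stale entries simply expire.
    let fresh = load_profile_from_db(user_id); // hypothetical DB call
    let _: () = con.set_ex(&key, &fresh, 60)?; // keep for 60 seconds
    Ok(fresh)
}

// Stand-in for a real database query.
fn load_profile_from_db(user_id: u64) -> String {
    format!("{{\"id\":{user_id},\"name\":\"example\"}}")
}

fn main() -> redis::RedisResult<()> {
    let client = redis::Client::open("redis://127.0.0.1/")?;
    let mut con = client.get_connection()?;
    println!("{}", cached_profile(&mut con, 42)?);
    Ok(())
}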

RUST Programming

We run core processes in Rust to get raw performance on the most resource-hungry operations. Rust is blazingly fast and memory-efficient: with no runtime or garbage collector, it can power performance-critical services, run on embedded devices, and easily integrate with other languages. Rust's rich type system and ownership model guarantee memory safety and thread safety, enabling you to eliminate many classes of bugs at compile time.
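As a small illustration of those compile-time guarantees (not code from our platform), the sketch below shares a counter across four threads. Rust only allows the counter to cross thread boundaries because it is wrapped in Arc and Mutex; leaving either out is a compile error rather than a runtime data race.

// Illustrative example of Rust's thread-safety guarantees.
use std::sync::{Arc, Mutex};
use std::thread;

fn main() {
    // Shared state must be explicitly wrapped in thread-safe types.
    let counter = Arc::new(Mutex::new(0u64));

    let handles: Vec<_> = (0..4)
        .map(|_| {
            let counter = Arc::clone(&counter);
            thread::spawn(move || {
                for _ in 0..1_000 {
                    *counter.lock().unwrap() += 1;
                }
            })
        })
        .collect();

    for handle in handles {
        handle.join().unwrap();
    }

    // Every increment is accounted for; the compiler ruled out data races.
    assert_eq!(*counter.lock().unwrap(), 4_000);
}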

What we run on Rust

Our streaming platform protocol is built in Rust! Our initial three implementations in other technologies were promising, but moving to Rust freed up over 60% of our CPU resources.
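Our protocol itself stays under the hood, but purely as a hypothetical sketch (the names, port, and chunk size are made up), the example below shows the general shape of an async Rust streaming service built on Tokio: every viewer is served by a lightweight task instead of an OS thread.

// Hypothetical sketch only, not the actual Paid2Stream protocol.
// Requires the tokio crate with the "full" feature set.
use tokio::io::AsyncWriteExt;
use tokio::net::TcpListener;

#[tokio::main]
async fn main() -> std::io::Result<()> {
    let listener = TcpListener::bind("127.0.0.1:9000").await?;

    loop {
        let (mut socket, peer) = listener.accept().await?;
        // Each viewer gets a lightweight task, not an OS thread.
        tokio::spawn(async move {
            let chunk = [0u8; 4096]; // placeholder media chunk
            for _ in 0..10 {
                if socket.write_all(&chunk).await.is_err() {
                    break; // viewer disconnected
                }
            }
            let _ = socket.shutdown().await;
            println!("finished streaming to {peer}");
        });
    }
}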

Our live chat protocol is also built in Rust! We initially started with a WebRTC peer-to-peer approach, which had its own limitations for large-scale group chat, so we had to innovate, and we relied on the raw power and performance of Rust!
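For illustration only (this is not our production chat code), a server-side fan-out in Rust can be sketched with a Tokio broadcast channel: one sender per room and one lightweight receiving task per connected client.

// Hypothetical sketch of server-side chat fan-out with a broadcast channel.
// Requires the tokio crate with the "full" feature set.
use tokio::sync::broadcast;

#[tokio::main]
async fn main() {
    // Room-wide channel; every client task gets its own receiver.
    let (tx, _initial_rx) = broadcast::channel::<String>(256);

    // Simulate three connected clients.
    let mut handles = Vec::new();
    for id in 0..3 {
        let mut rx = tx.subscribe();
        handles.push(tokio::spawn(async move {
            while let Ok(msg) = rx.recv().await {
                println!("client {id} received: {msg}");
            }
        }));
    }

    // A single send fans out to every subscriber.
    tx.send("hello room".to_string()).unwrap();
    drop(tx); // closing the sender ends every receiver loop

    for handle in handles {
        handle.await.unwrap();
    }
}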

CPU and memory savings are not our only victory with Rust: so is our peer-to-peer assisted delivery streaming tracker! Using WebTorrent technology we save up to 70% or more on bandwidth, and Rust does all the heavy lifting!
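As a rough, hypothetical sketch of the bookkeeping such a tracker performs (struct and field names here are invented for illustration), the example below maps each stream's swarm to its current peers so a new viewer can be handed other peers to fetch segments from instead of hitting the origin. The real WebTorrent announce protocol, WebSocket signalling, and WebRTC connections are out of scope.

// Hypothetical in-memory swarm registry for peer-assisted delivery.
use std::collections::{HashMap, HashSet};

#[derive(Default)]
struct Tracker {
    // info_hash -> set of peer ids currently in that swarm
    swarms: HashMap<String, HashSet<String>>,
}

impl Tracker {
    // A peer announces itself and gets back up to `want` other peers to connect to.
    fn announce(&mut self, info_hash: &str, peer_id: &str, want: usize) -> Vec<String> {
        let swarm = self.swarms.entry(info_hash.to_string()).or_default();
        let others: Vec<String> = swarm
            .iter()
            .filter(|p| p.as_str() != peer_id)
            .take(want)
            .cloned()
            .collect();
        swarm.insert(peer_id.to_string());
        others
    }

    // Remove a peer that stopped watching or seeding.
    fn leave(&mut self, info_hash: &str, peer_id: &str) {
        if let Some(swarm) = self.swarms.get_mut(info_hash) {
            swarm.remove(peer_id);
        }
    }
}

fn main() {
    let mut tracker = Tracker::default();
    assert!(tracker.announce("stream-123", "peer-a", 8).is_empty());
    assert_eq!(
        tracker.announce("stream-123", "peer-b", 8),
        vec!["peer-a".to_string()]
    );
    tracker.leave("stream-123", "peer-a");
}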