Press "Enter" to skip to content

The Razer Lambda Tensorbook deep learning laptop is a milestone in pure Linux power

Review: If you’re looking for a laptop to use for machine learning, Jack Wallen is convinced you won’t find a better option on the market than the Lambda Tensorbook.

Image: Razer

Every so often a new technology comes along that changes the landscape of how businesses work, developers develop and data is used and manipulated. Machine learning is one such technology.

Machine learning is a branch of artificial intelligence that focuses on using data and algorithms to imitate the way humans learn, gradually improving accuracy, capability and interaction. Machine learning can be applied to use cases such as:

  • Voice assistants
  • Personalized marketing
  • Fraud detection
  • Autonomous vehicles
  • Optimizing transportation
  • Behavior prediction
  • Healthcare
  • Process automation
  • Chatbots
  • Security

The thing about machine learning is that it takes considerable hardware to do the job correctly. Run a machine learning platform or app on the wrong hardware and it will drag that machine to a halt faster than you can point your finger toward the video screen and command, “Make it so.”


Run that platform or app on the right hardware and it will run at warp speed to its destination.

So, when I received the Razer Lambda Tensorbook to review, I was excited to see which category it would fall under.

I had a sneaking suspicion it would be capable of warp speed right out of the gate.

I was not wrong.

What is the Lambda Tensorbook?

Simply put, the Lambda Tensorbook is a laptop designed specifically for machine learning. It’s a brilliant combination of hardware and software that comes together to create a seriously impressive platform, one that lets ML developers build and test with power to spare.

The Tensorbook starts with a GeForce RTX 3080 Max-Q 16GB GPU, which can reportedly deliver model training up to 4x faster than even the Apple M1 Max and up to 10x faster than Google Colab instances.

Next comes the full Lambda Stack, which is pre-installed on Ubuntu Linux and includes:

  • The latest NVIDIA drivers
  • PyTorch
  • TensorFlow
  • CUDA
  • cuDNN
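
Because both frameworks ship pre-built against the bundled CUDA and cuDNN, a quick way to confirm the stack actually sees the RTX 3080 Max-Q is to ask each one from a Python prompt. Here’s a minimal sanity check (the exact versions printed will depend on which Lambda Stack release you’re running):

import torch
import tensorflow as tf

# PyTorch: report the framework version and whether CUDA sees the GPU.
print("PyTorch:", torch.__version__, "| CUDA available:", torch.cuda.is_available())
if torch.cuda.is_available():
    print("GPU:", torch.cuda.get_device_name(0))

# TensorFlow: report the version and any GPUs it has registered.
print("TensorFlow:", tf.__version__, "| GPUs:", tf.config.list_physical_devices("GPU"))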

Lambda Stack is used by Apple, Intel, Samsung, IBM, Microsoft, Amazon, Adobe, LinkedIn, Boeing, Harvard and even the U.S. Department of Defense.

The remaining hardware list looks like this:

  • VRAM–16GB GDDR6
  • CPU–Intel Core i7-11800H
  • RAM–64GB 3,200MHz DDR4
  • Storage–2TB NVMe PCIe 4.0
  • Display–165Hz 1440p 15.6″

Ports include:

  • 2 Thunderbolt 4 (USB-C)
  • HDMI 2.1
  • Lock slot
  • UHS-III SD card reader
  • 3 USB 3.2 Gen 2 Type A
  • 3.5mm headphone/mic combo jack
  • Power port

Other hardware features include the following:

  • Wi-Fi 6E
  • Bluetooth 5.2
  • 80-watt-hour LiPo battery
  • Dimensions–35.5cm x 23.5cm x 1.69cm

The Lambda Tensorbook weighs in at a beefy 4.43 pounds, so it isn’t exactly meant to be ultraportable, but that heft brings serious power along with it.

Speaking of which …

The sheer power of the Lambda Tensorbook

One really cool thing about the Lambda Tensorbook review unit is that it shipped with an impressive ML application called DeepFaceLive pre-installed. The application’s description bills it as a real-time face swap for PC streaming or video calls. I’m not going to comment on how this type of application could be abused, but it’s seriously impressive to see it work.

To launch the application, I had to open a terminal window, change into the ~/DeepFaceLive folder and issue the command:

python3 main.py run DeepFaceLive --userdata-dir ./data

When the app opens (Figure A), it uses the laptop’s camera to stream video of your face swapped with a face chosen from the Model drop-down.

Figure A

The DeepFaceLive app running on the Lambda Tensorbook.

I was shocked at how smooth the streaming face-swap video appeared. Out of the box it’s set to use the NVIDIA GeForce GPU as the computing device. After I switched it to the CPU, the streaming video became jerky and the laptop slowed down dramatically. Switching the device back to the GPU (from the Device drop-down in the Face swapper section) returned performance to normal.
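
DeepFaceLive handles that switch through its own Device drop-down, but the underlying idea is the same as moving a PyTorch model and its input frames between devices. Here’s a rough illustration of why the CPU path stutters, using a small stand-in convolution rather than DeepFaceLive’s actual code:

import time
import torch
import torch.nn as nn

# A small convolution standing in for a face-swap network; purely illustrative.
model = nn.Conv2d(3, 16, kernel_size=3, padding=1)
frame = torch.randn(1, 3, 720, 1280)  # one fake 720p video frame

for device in ("cuda", "cpu"):
    if device == "cuda" and not torch.cuda.is_available():
        continue  # skip the GPU path on machines without CUDA
    m = model.to(device)
    f = frame.to(device)
    with torch.no_grad():
        m(f)  # warm up
        if device == "cuda":
            torch.cuda.synchronize()
        start = time.perf_counter()
        for _ in range(30):
            m(f)  # same forward pass, different hardware
        if device == "cuda":
            torch.cuda.synchronize()
    print(f"{device}: {30 / (time.perf_counter() - start):.1f} frames/sec")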

Another sample application is a Jupyter Notebook created for benchmarking. Running this app opens a new notebook in the default Firefox browser, where you can scroll through the various sections to see benchmarking information (Figure B) based on various training models.

Figure B

The Jupyter Notebook testing app is impressive and informative.
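
The notebook’s exact contents are specific to the Tensorbook image, but the general pattern behind that kind of training benchmark is straightforward. Here’s a minimal PyTorch sketch (the model size, batch size and step count are my own illustrative choices, not the notebook’s):

import time
import torch
import torch.nn as nn

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# A small toy model and synthetic batch, just to measure training throughput.
model = nn.Sequential(nn.Linear(1024, 2048), nn.ReLU(), nn.Linear(2048, 10)).to(device)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = nn.CrossEntropyLoss()
x = torch.randn(256, 1024, device=device)
y = torch.randint(0, 10, (256,), device=device)

# Warm up, then time a fixed number of full training steps.
for _ in range(5):
    optimizer.zero_grad()
    loss_fn(model(x), y).backward()
    optimizer.step()
if device.type == "cuda":
    torch.cuda.synchronize()

start = time.perf_counter()
for _ in range(100):
    optimizer.zero_grad()
    loss_fn(model(x), y).backward()
    optimizer.step()
if device.type == "cuda":
    torch.cuda.synchronize()
print(f"{device}: {100 / (time.perf_counter() - start):.1f} training steps/sec")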

Nice touches in the Lambda Tensorbook

There are a few really nice touches to be found in the Lambda Tensorbook that have less to do with machine learning and more to do with making it just a killer laptop. First, the keyboard is fantastic. It ships with purple-backlit keys (Figure C) that have just the right travel and feel for those who need a responsive keyboard.

Figure C

The purple backlit keys are fantastic to use.

Next, the speakers on this thing are some of the best sounding I’ve heard on a laptop in a while. Compared with my MacBook Pro M1, the Lambda Tensorbook is a treat to listen to.

The case for this laptop is also very sleek (Figure D), with clean lines and very little to detract from a spartan, functional design. It reminds me of my favorite Chromebook Pixel from 2015.

Figure D

The Lambda Tensorbook case is elegant and simple.

Finally, the desktop is a pretty bare-bones GNOME that defaults to a dark theme. Although I prefer lighter themes, I can understand why they’d go with the dark option, given that battery life will be at a premium when using this beast untethered from an outlet.

Who is the Lambda Tensorbook for?

If I’ve ever experienced a more niche piece of hardware, I cannot recall what it was. The Lambda Tensorbook is niche at its best … and I mean that in all positive permutations of the phrase. This laptop is an absolute beast that shrugs off anything you throw at it (especially when you’re doing ML training with the GPU). So if machine learning and AI are your jam (and you’re willing to pony up the $3,499 for the base model), this laptop is what you want.

It’s expensive, it’s heavy, and it can get pretty hot, but when you need unrivaled mobile power for ML/AI, you’d be hard-pressed to find a better option.

