Public Feed
Discover recent notes and bookmarks from the community
uNetworking/uWebSockets.js: μWebSockets for Node.js back-ends :metal:
µWebSockets.js is a standards-compliant web server written in 10,000 lines of C++. It is exposed to Node.js as a simple-to-use, native V8 addon and performs at least 10x faster than Socket.IO and 8.5x faster than Fastify. It makes up the core components of Bun and is the fastest standards-compliant web server in the TechEmpower (not endorsed) benchmarks.
Web Frameworks Benchmark
Top photos showing effects of rapid climate change in 2025 | AP News
Tropical storms pummeled the land and ravaged ecosystems. Floodwaters engulfed streets and left cars stuck in mud. And fires scorched trees and consumed houses.
Test note
totally blowing up Scott’s database. …
Online 3D Printing Service | Custom 3D Printed Parts - JLC3DP
Onshape | Product Development Platform
3D Modeling.
Carvera Air: A Smart and Affordable Desktop CNC Machine – Makera
I Built the FIRST VENTURI ROCKET ENGINE - YouTube
ARR Club: Track & Compare Startup Revenue, Valuation & Growth
Ford’s next F-150 Lightning will have a gas generator as it pivots away from large EVs | TechCrunch
Ford is ending production of the fully electric F-150 Lightning as part of a broader companywide shakeup of its electric vehicle plans, the company announced Monday. In its place, Ford will sell what’s known as an “extended range electric vehicle” version of the truck, which adds a gas generator that can recharge the battery pack to power the motors for over 700 miles.
Ford revealed the F-150 Lightning in 2021, two years after it first announced plans for an all-electric Mustang, the Mach-E. Ford teased a $40,000 price tag for the Lightning, which was meant to be a flagship product for the company’s $22 billion push into electric vehicles. Like most large electric trucks, though, the F-150 Lightning struggled in the U.S. market. Part of that was because the $40,000 price tag never materialized for most buyers, as that base trim was targeted specifically at fleet customers. Ford wound up selling around 7,000 Lightnings per quarter over the last two years, with a peak of nearly 11,000 in the fourth quarter of 2024.

EVs have faced a lot of headwinds since the F-150 Lightning was first introduced. Tesla kicked off a dramatic price war to counter falling sales, which ate into legacy automakers’ thin (or negative) margins. The reelection of Donald Trump, along with Republicans taking control of Congress, has led to a reversal of many Biden-era policies meant to encourage the sale of electric vehicles.
hiyouga/LLaMA-Factory: Unified Efficient Fine-Tuning of 100+ LLMs & VLMs (ACL 2024)
Easily fine-tune 100+ large language models with zero-code CLI and Web UI
6476_SWE_bench_Can_Language_Mo.pdf
We introduce SWE-bench, an evaluation framework consisting of 2,294 software engineering problems drawn from real GitHub issues and corresponding pull requests across 12 popular Python repositories. Given a codebase along with a description of an issue to be resolved, a language model is tasked with editing the codebase to address the issue. Resolving issues in SWE-bench frequently requires understanding and coordinating changes across multiple functions, classes, and even files simultaneously, calling for models to interact with execution environments, process extremely long contexts and perform complex reasoning that goes far beyond traditional code generation tasks.
Unsloth AI - Open Source Fine-tuning & RL for LLMs
Easy-to-use, well-documented fine-tuning. Optimized for NVIDIA, with AMD support and Apple M-series support in the works.
Roomba maker iRobot files for bankruptcy, pursues manufacturer buyout | Reuters
New U.S. tariffs have also harmed the company, especially a 46% levy on imports from Vietnam, where iRobot manufactures vacuum cleaners for the U.S. market. The tariffs raised the company’s costs by $23 million in 2025, while making it more difficult to plan for the future, according to iRobot's court filings.
Monitoring Kubernetes pod performance metrics | Datadog
Tires & chains | WSDOT
EthStaker Community
Esusu, platform for renters to build credit, valued at $1.2 billion
While on-time mortgage payments are known to increase one’s credit score, many renters have no credit history at all. Esusu reports on-time rent payments to credit bureaus so renters can build their scores. Over 50 million Americans lack a credit history with the three major credit bureaus: Experian, Equifax and TransUnion.
Introducing Nested Learning: A new ML paradigm for continual learning
Continually updating a model's parameters with new data often leads to “catastrophic forgetting” (CF), where learning new tasks sacrifices proficiency on old tasks. Researchers traditionally combat CF through architectural tweaks or better optimization rules. However, for too long, we have treated the model's architecture (the network structure) and the optimization algorithm (the training rule) as two separate things, which prevents us from achieving a truly unified, efficient learning system.
By defining an update frequency rate, i.e., how often each component's weights are adjusted, we can order these interconnected optimization problems into "levels." This ordered set forms the heart of the Nested Learning paradigm.
We observed that many standard optimizers rely on simple dot-product similarity (a measure of how alike two vectors are, calculated as the sum of the products of their corresponding components), whose updates don't account for how different data samples relate to each other. By changing the underlying objective of the optimizer to a more standard loss metric, such as L2 regression loss (a common loss function in regression tasks that quantifies error by summing the squares of the differences between predicted and true values), we derive new formulations for core concepts like momentum, making them more resilient to imperfect data.
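As a toy illustration of that reframing (my own sketch, not the paper's actual derivation; the function name and constants are hypothetical): taking one gradient-descent step on an L2 regression loss toward each incoming gradient recovers the familiar exponential-moving-average momentum update, which damps noise in individual samples rather than accumulating it.

```python
def ema_momentum(grads, eta=0.1):
    """One gradient-descent step per sample on the L2 regression loss
    0.5 * (m - g)^2 toward each incoming gradient g:
        m <- m - eta * (m - g)  ==  (1 - eta) * m + eta * g,
    which is exactly an exponential moving average of the gradients."""
    m = 0.0
    for g in grads:
        m -= eta * (m - g)
    return m

# Noisy gradients oscillating around a true value of 1.0:
noisy = [1.0 + (0.5 if i % 2 == 0 else -0.5) for i in range(200)]
smoothed = ema_momentum(noisy)
print(round(smoothed, 2))  # close to the underlying 1.0 despite the noise
```

The point of the sketch is only that viewing the momentum buffer as a tiny memory minimizing a regression loss makes the familiar update rule fall out, which is the kind of unification the excerpt describes.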
In a standard Transformer, the sequence model acts as a short-term memory, holding the immediate context, while the feedforward neural networks act as long-term memory, storing pre-training knowledge. The Nested Learning paradigm extends this concept into what we call a “continuum memory system” (CMS), where memory is seen as a spectrum of modules, each updating at a different, specific frequency rate. This creates a much richer and more effective memory system for continual learning.
"Nested Learning" extends the traditional two-tier memory concept of "attention layers" (short-term memory / context window) and "feed-forward network layers" (long-term memory) into a spectrum of modules that update at different rates: some very frequently (like attention), some rarely (like FFNs), and others at various points in between.
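A minimal sketch of that "spectrum of update rates" idea, assuming nothing beyond what the excerpts state (the function name and the specific periods are my own illustration, not from the post): each memory level is assigned an update period, and a level's weights are only adjusted on steps that are multiples of its period.

```python
def run_cms(num_steps, update_periods):
    """Count how often each memory 'level' updates when level i
    is refreshed every update_periods[i] steps (1 = every step,
    like attention; large = rarely, like feed-forward weights)."""
    counts = [0] * len(update_periods)
    for step in range(1, num_steps + 1):
        for i, period in enumerate(update_periods):
            if step % period == 0:
                counts[i] += 1  # this level's weights get adjusted
    return counts

# Three levels: fast (every step), medium (every 10 steps), slow (every 100).
print(run_cms(1000, [1, 10, 100]))  # -> [1000, 100, 10]
```

Ordering the levels by update frequency like this is what the post calls arranging the interconnected optimization problems into "levels" of a continuum memory system.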