activescott's Notes

Public notes from activescott

Tuesday, December 16, 2025

Ford is ending production of the fully-electric F-150 Lightning as part of a broader companywide shakeup of its electric vehicle plans, the company announced Monday. In its place, Ford will sell what’s known as an “extended range electric vehicle” version of the truck, which adds a gas generator that can recharge the battery pack to power the motors for over 700 miles.

Ford revealed the F-150 Lightning in 2021, two years after it first announced plans for an all-electric Mustang, the Mach-E. Ford teased a $40,000 price tag for the Lightning, which was meant to be a flagship product for the company’s $22 billion push into electric vehicles. Like most large electric trucks, though, the F-150 Lightning struggled in the U.S. market. Part of that was because the $40,000 price tag never materialized for most buyers, as that base trim was targeted specifically at fleet customers. Ford wound up selling around 7,000 Lightnings per quarter over the last two years, with a peak of nearly 11,000 in the fourth quarter of 2024.

EVs have faced a lot of headwinds since the F-150 Lightning was first introduced. Tesla kicked off a dramatic price war to counter falling sales, which ate into legacy automakers’ thin (or negative) margins. The reelection of Donald Trump, along with Republicans taking control of Congress, has led to a reversal of many Biden-era policies meant to encourage the sale of electric vehicles.

we introduce SWE-bench, an evaluation framework consisting of 2,294 software engineering problems drawn from real GitHub issues and corresponding pull requests across 12 popular Python repositories. Given a codebase along with a description of an issue to be resolved, a language model is tasked with editing the codebase to address the issue. Resolving issues in SWE-bench frequently requires understanding and coordinating changes across multiple functions, classes, and even files simultaneously, calling for models to interact with execution environments, process extremely long contexts, and perform complex reasoning that goes far beyond traditional code generation tasks.
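The resolution check described above can be sketched in miniature. This is a hypothetical, heavily simplified harness (the real SWE-bench applies unified diffs and runs each repository's test suite in an isolated environment); here a "codebase" is just a dict of file contents, an "edit" is a dict of replacement files, and tests are callables:

```python
def evaluate_instance(codebase, model_edit, fail_to_pass, pass_to_pass):
    """Simplified sketch of SWE-bench-style resolution checking.

    Apply the model's edit to the codebase, then require that the tests
    the gold pull request fixed now pass (fail_to_pass), without breaking
    the tests that already passed (pass_to_pass).
    """
    patched = dict(codebase)
    patched.update(model_edit)  # a model edit may touch multiple files
    return (all(test(patched) for test in fail_to_pass) and
            all(test(patched) for test in pass_to_pass))


# Toy instance: the "issue" is that add() subtracts instead of adding.
codebase = {"calc.py": "def add(a, b):\n    return a - b\n"}
model_edit = {"calc.py": "def add(a, b):\n    return a + b\n"}

def add_works(files):
    ns = {}
    exec(files["calc.py"], ns)
    return ns["add"](2, 3) == 5
```

With the fix applied, `evaluate_instance(codebase, model_edit, [add_works], [])` reports the instance resolved; with an empty edit it does not.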

Monday, December 15, 2025

Sunday, December 14, 2025

continually updating a model's parameters with new data often leads to “catastrophic forgetting” (CF), where learning new tasks sacrifices proficiency on old tasks. Researchers traditionally combat CF through architectural tweaks or better optimization rules. However, for too long, we have treated the model's architecture (the network structure) and the optimization algorithm (the training rule) as two separate things, which prevents us from achieving a truly unified, efficient learning system.

By defining an update frequency rate, i.e., how often each component's weights are adjusted, we can order these interconnected optimization problems into "levels." This ordered set forms the heart of the Nested Learning paradigm.

We observed that many standard optimizers rely on simple dot-product similarity (a measure of how alike two vectors are by calculating the sum of the products of their corresponding components) whose update doesn't account for how different data samples relate to each other. By changing the underlying objective of the optimizer to a more standard loss metric, such as L2 regression loss (a common loss function in regression tasks that quantifies the error by summing the squares of the differences between predicted and true values), we derive new formulations for core concepts like momentum, making them more resilient to imperfect data.
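The contrast can be made concrete with a small sketch. The classical momentum buffer linearly accumulates gradients; if instead the buffer is treated as a memory trained with an L2 regression loss, one gradient step on `0.5 * ||m - grad||²` gives `m ← (1 - η)m + η·grad`, i.e. an exponential moving average. This is an illustrative derivation consistent with the description above, not the paper's exact formulation:

```python
import numpy as np

def momentum_update(m, grad, beta=0.9):
    # classical momentum: linear accumulation of gradients
    return beta * m + grad

def l2_momentum_update(m, grad, eta=0.1):
    # view the buffer m as a memory minimizing 0.5 * ||m - grad||^2;
    # one gradient step: m - eta * (m - grad) = (1 - eta) * m + eta * grad,
    # an exponential moving average of the gradients (illustrative only)
    return (1 - eta) * m + eta * grad
```

The EMA form bounds how much any single (possibly noisy) gradient can move the buffer, which is one intuition for why the L2-derived update is more resilient to imperfect data.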

In a standard Transformer, the sequence model acts as a short-term memory, holding the immediate context, while the feedforward neural networks act as long-term memory, storing pre-training knowledge. The Nested Learning paradigm extends this concept into what we call a “continuum memory system” (CMS), where memory is seen as a spectrum of modules, each updating at a different, specific frequency rate. This creates a much richer and more effective memory system for continual learning.

"Nested Learning" extends the traditional two-tier memory concept of "attention layers" (short-term memory / context window) and "feed-forward network layers" (long-term memory) into a spectrum of modules that update at different rates, some very frequently (like attention), some rarely (like FFNs), and others at various points in between.
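That spectrum of update rates can be sketched as a set of parameter blocks that each commit updates on their own period. All names here are hypothetical; this is a toy illustration of the "continuum memory" idea, not the paper's architecture:

```python
import numpy as np

class MemoryModule:
    """A parameter block that only commits updates every `period` steps."""
    def __init__(self, dim, period):
        self.weights = np.zeros(dim)
        self.period = period          # update frequency: 1 = every step
        self.pending = np.zeros(dim)  # updates accumulated between commits

    def step(self, update, t):
        self.pending += update
        if t % self.period == 0:      # this level's turn to update
            self.weights += self.pending / self.period
            self.pending[:] = 0.0

class ContinuumMemory:
    """A spectrum of memory levels, from fast (period=1) to slow."""
    def __init__(self, dim, periods=(1, 10, 100)):
        self.levels = [MemoryModule(dim, p) for p in periods]

    def step(self, update, t):
        for level in self.levels:
            level.step(update, t)
```

After five steps of a unit update, the fast level (period 1) has moved while the slow levels (periods 10 and 100) haven't committed anything yet — the same "different components adjust at different rates" structure described above.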

That’s the New York Times, CNN, CNBC, NBC, and the Guardian all confidently telling their readers that Trump can magically override state sovereignty with a memo. These aren’t fringe blogs—these are supposedly serious news organizations with actual editors who apparently skipped the day they taught how the federal government works. They have failed the simplest journalistic test of “don’t print lies in the newspaper.”

Executive orders aren’t laws. They’re memos. Fancy, official memos that tell federal employees how to do their jobs, but memos nonetheless. You want to change what states can and can’t do? You need this little thing called “Congress” to pass this other little thing called “legislation.” Trump can’t just declare state laws invalid any more than he can declare himself emperor of Mars.

But here’s where this gets kinda funny (in a stupid way): that “interstate commerce” language could backfire spectacularly. Almost all state laws trying to regulate the internet—from child safety laws to age verification to the various attempts at content moderation laws—might run afoul of the dormant commerce clause by attempting to regulate interstate commerce if what the admin here claims is true (it’s not really true, but if the Supreme Court buys it…). Courts had been hesitant to use this nuclear option because it would essentially wipe out the entire patchwork of state internet regulation that’s been building for years, and a few decades of work in other areas that hasn’t really been challenged. Also, because they’ve mostly been able to invalidate those laws using the simple and straightforward First Amendment.

The real story here isn’t that Trump signed some groundbreaking AI policy—it’s that the entire mainstream media apparatus completely failed to understand the most basic principles of American government. Executive orders aren’t magic spells that override federalism. They’re memos.

Saturday, December 13, 2025

Friday, December 12, 2025

Thursday, December 11, 2025

The measles, mumps and rubella vaccine is safe and provides 97% protection against measles after two doses. Most children in the U.S. are required to get the shot to attend school. But vaccination rates have declined as more parents waive the shots or have fallen behind on recommended vaccination schedules.

Wednesday, December 10, 2025

Starting today, self-service access to Reddit’s public data API will be closed. Anyone looking to build with Reddit data, whether you’re a developer, researcher, or moderator, will need to request approval before gaining access.

WTF?
