
Passkeys: A Shattered Dream


At around 11pm last night my partner went to change our lounge room lights with our home light control system. When she tried to log in, her account couldn't be accessed. Her Apple Keychain had deleted the Passkey she was using on that site.

This is just the latest in a long trail of enshittification that has undermined Webauthn. I'm over it at this point, and I think it's time to pour one out for Passkeys. The irony is not lost on me that I'm about to release a new major version of webauthn-rs today as I write this.

The Dream

In 2019 I flew to my mate's place in Sydney and spent a week starting to write what is now the Webauthn library for Rust. In that time I found a number of issues in the standard and contributed improvements to the Webauthn workgroup, even though it took a few years for those issues to be resolved. I started to review things and participate more.

At the time there was a lot of optimism that this technology could be the end of passwords. You had three major use cases:

  • Second Factor
  • Passwordless
  • Usernameless

Second Factor was a stepping stone toward the latter two. Passwordless is where you would still type in an account name then authenticate with PIN+Touch to your security key, and usernameless is where the identity for your account was resident (discoverable) on the key. This was (from my view) seen as a niche concept by developers since really - how hard is it for a site to have a checkbox that says "remember me"?

This library led to Kanidm being (to my knowledge) the very first open-source IDM to implement passwordless (now passkeys). The experience was wonderful. You went to Kanidm, typed in your username and then were prompted to type your PIN and touch your key. Simple, fast, easy.

For devices like your iPhone or Android, you would do similar - just use your Touch ID and you're in.

It was so easy, so accessible, that I remember how it almost felt impossible: that authentication could be cryptographic in nature, yet so usable and trivial for consumers. There really was the idea and goal within FIDO and Webauthn that this could be "the end of passwords".

This is what motivated me to continue to improve webauthn-rs. Its reach has gone beyond what I expected, with parts of it being used in Firefox's authenticator-rs, a whole microcosm of Rust Identity Providers (IDPs) being created from this library and my work, and even other languages' Webauthn implementations and password managers using our library as the reference implementation to test against. I cannot overstate how humbled I am by the influence webauthn-rs has had.

The Warnings

However, warnings started to appear that the standard was not as open as people envisaged. The issue is well known - Chrome controls a huge portion of the browser market, and its development is tightly controlled by Google.

An example of this was the Authenticator Selection Extension.

This extension is important for sites that have strict security requirements because they will attest the make and model of the authenticator in use. If you know that the attestation will only accept certain devices, then the browser should filter out and only allow those devices to participate.

However, Chrome simply never implemented it, and so it was removed. The reason it was removed? Chrome never implemented it. In effect, if Chrome doesn't like something in the specification they can just veto it without consequence.

Later the justification for this not being implemented was: "We have never implemented it because we don't feel that authenticator discrimination is broadly a good thing. ... they [users] should have the expectation that a given security key will broadly work where they want to use it."

I want you to remember this quote and its implications.

Users should be able to use any device they choose without penalty.

Now I certainly agree with this notion for general sites on the internet, but within a business, where we have policy around which devices are acceptable, the ability to filter devices does matter.

This makes it very possible that you can go to a corporate site and enroll a security key: it appears to work, but registration then fails because the IDP rejected the device attestation (even better if this burns one of your resident key slots, which can not be deleted without a full reset of your device). That's right: even without this extension, IDPs can still "discriminate" against devices, but the user experience is much worse, and the consequences far more severe in some cases.

The kicker is that Chrome has internal feature flags that they can use for Google's needs. They can simply enable their own magic features that control authenticator models for their policy, while everyone else has to have a lesser experience.

The greater warning here is that many of these decisions are made at "F2F" or Face to Face meetings held in the US. This excludes the majority of international participants leading some voices to be stronger than others. It's hard to convince someone when you aren't in the room, even more so when the room is in a country that has a list of travel advisories including "Violent crime is more common in the US than in Australia", "There is a persistent threat of mass casualty violence and terrorist attacks in the US" and "Medical costs in the US are extremely high. You may need to pay up-front for medical assistance". (As an aside, there are countries that have a "do not travel" warning for less, but somehow the US gets a pass ...).

The Descent

In 2022 Apple announced Passkeys.

At the time this was just a really nice "marketing" term for passwordless, and Apple's Passkeys had the ability to opportunistically be usernameless. It was all in all very polished and well done.

But of course, thought leaders exist, and Apple hadn't defined what a Passkey was. One of those thought leaders took to the FIDO conference stage and announced "Passkeys are resident keys", at the same time as they unleashed a passkeys dev website (I won't link to it out of principle).

The issue is described in detail in another of my blog posts, but to summarise: this push to resident keys means that security keys are excluded, because they often have extremely low limits on storage, the highest being 25 for YubiKeys. That simply won't cut it for most people, who have more than 25 accounts.

Now, with resident keys as passkeys, we as users certainly can't have the expectation that our security keys will work where we want to use them!

The Enshittocene Period

Since then, Passkeys have come to be seen as a way to capture users and audiences into a platform. What better way to encourage long-term entrapment of users than by locking all their credentials into your platform, and even better, credentials that can't be extracted or exported in any capacity.

Both Chrome and Safari will try to force you into using hybrid (caBLE), where you scan a QR code with your phone to authenticate; you have to click through menus to use a security key. caBLE is not even a good experience, taking more than 60 seconds to work in most cases. The UI is beyond obnoxious at this point. Sometimes I think the password game has a better UX.

The more egregious offender is Android, which won't even activate your security key if the website sends the set of options that are needed for Passkeys. This means the IDP gets to choose what device you enroll without your input. And of course, all the developer examples only show you the options to activate "Google Passkeys stored in Google Password Manager". After all, why would you want to use anything else?

A sobering pair of reads are the GitHub Passkey Beta and GitHub Passkey threads. There are instances of users whose security keys can not be enrolled because the resident key slots are filled. Multiple users describe that Android can not create Passkeys due to platform bugs. Some devices need firmware resets to create Passkeys. Keys can be saved on the client but not the server, leading to duplicate account presence and credentials that don't work, or worse, leading users to delete the real credentials.

The helplessness of users on these threads is obvious - and these are technical early adopters, the very users we need as advocates for the change from passwords to passkeys. If these users can't make it work, how will people from other disciplines fare?

Externally there are other issues. Apple Keychain has personally wiped out all my Passkeys on three separate occasions. There are external reports we have received from other users whose Keychain Passkeys have been wiped just like mine.

Now as users we have the expectation that keys won't be created or they will have disappeared when we need them most.

In order to try to resolve this, the workgroup seems to be doubling down on more complex JS APIs to try to patch over the issues that they created in the first place. All this extra complexity comes with fragility and more bad experiences, but without resolving the core problems.

It's a mess.

The Future

At this point I think that Passkeys will fail in the hands of the general consumer population. We missed our golden chance to eliminate passwords through a desire to capture markets and promote hype.

Corporate interests have overruled good user experience once again. Just like ad-blockers, I predict that Passkeys will only be used by a small subset of the technical population, and consumers will generally reject them.

To reiterate - my partner, who is extremely intelligent, an avid computer gamer, and a veterinary surgeon, has sworn off Passkeys because the user experience is so shit. She wants to go back to passwords.

And I'm starting to agree - a password manager gives a better experience than passkeys.

That's right. I'm here saying passwords are a better experience than passkeys. Do you know how much it pains me to write this sentence? (and yes, that means MFA with TOTP is still important for passwords that require memorisation outside of a password manager).

So do yourself a favour. Get something like Bitwarden, or if you like self-hosting, Vaultwarden. Let it generate your passwords and manage them. If you really want passkeys, put them in a password manager you control. But don't use a platform-controlled passkey store, and be very careful with security keys.

And if you do want to use a security key, just use it to unlock your password manager and your email.

Within enterprise there is still a place for attested security keys, where you can control the whole experience to avoid the vendor lock-in parts. It still has rough edges though. Just today I found a browser with broken attestation, which is not good. You still have to dive through obnoxious UX elements that attempt to force you through caBLE even though your IDP will only accept certain security key models, so you're still likely to have some confused users.

But at this point, in Kanidm we are looking into device certificates and smartcards instead. The UI is genuinely better. Which says a lot considering the PKCS11 and PIV specifications. But at least PIV won't fall prone to attempts to enshittify it.


Announcing TypeScript 5.5 Beta


Today we are excited to announce the availability of TypeScript 5.5 Beta.

To get started using the beta, you can get it through NuGet, or through npm with the following command:

npm install -D typescript@beta

Here’s a quick list of what’s new in TypeScript 5.5!

Inferred Type Predicates

This section was written by Dan Vanderkam, who implemented this feature in TypeScript 5.5. Thanks Dan!

TypeScript’s control flow analysis does a great job of tracking how the type of a variable changes as it moves through your code:

interface Bird {
    commonName: string;
    scientificName: string;
    sing(): void;
}

// Maps country names -> national bird.
// Not all nations have official birds (looking at you, Canada!)
declare const nationalBirds: Map<string, Bird>;

function makeNationalBirdCall(country: string) {
  const bird = nationalBirds.get(country);  // bird has a declared type of Bird | undefined
  if (bird) {
    bird.sing();  // bird has type Bird inside the if statement
  } else {
    // bird has type undefined here.
  }
}

By making you handle the undefined case, TypeScript pushes you to write more robust code.

In the past, this sort of type refinement was more difficult to apply to arrays. This would have been an error in all previous versions of TypeScript:

function makeBirdCalls(countries: string[]) {
  // birds: (Bird | undefined)[]
  const birds = countries
    .map(country => nationalBirds.get(country))
    .filter(bird => bird !== undefined);

  for (const bird of birds) {
    bird.sing();  // error: 'bird' is possibly 'undefined'.
  }
}

This code is perfectly fine: we’ve filtered all the undefined values out of the list. But TypeScript hasn’t been able to follow along.

With TypeScript 5.5, the type checker is fine with this code:

function makeBirdCalls(countries: string[]) {
  // birds: Bird[]
  const birds = countries
    .map(country => nationalBirds.get(country))
    .filter(bird => bird !== undefined);

  for (const bird of birds) {
    bird.sing();  // ok!
  }
}

Note the more precise type for birds.

This works because TypeScript now infers a type predicate for the filter function. You can see what’s going on more clearly by pulling it out into a standalone function:

// function isBirdReal(bird: Bird | undefined): bird is Bird
function isBirdReal(bird: Bird | undefined) {
  return bird !== undefined;
}

bird is Bird is the type predicate. It means that, if the function returns true, then it’s a Bird (if the function returns false then it’s undefined). The type declarations for Array.prototype.filter know about type predicates, so the net result is that you get a more precise type and the code passes the type checker.

TypeScript will infer that a function returns a type predicate if these conditions hold:

  1. The function does not have an explicit return type or type predicate annotation.
  2. The function has a single return statement and no implicit returns.
  3. The function does not mutate its parameter.
  4. The function returns a boolean expression that’s tied to a refinement on the parameter.

Generally this works how you’d expect. Here’s a few more examples of inferred type predicates:

// const isNumber: (x: unknown) => x is number
const isNumber = (x: unknown) => typeof x === 'number';

// const isNonNullish: <T>(x: T) => x is NonNullable<T>
const isNonNullish = <T,>(x: T) => x != null;

Previously, TypeScript would have just inferred that these functions return boolean. It now infers signatures with type predicates like x is number or x is NonNullable<T>.

Type predicates have "if and only if" semantics. If a function returns x is T, then it means that:

  1. If the function returns true then x has type T.
  2. If the function returns false then x does not have type T.

If you’re expecting a type predicate to be inferred but it’s not, then you may be running afoul of the second rule. This often comes up with "truthiness" checks:

function getClassroomAverage(students: string[], allScores: Map<string, number>) {
  const studentScores = students
    .map(student => allScores.get(student))
    .filter(score => !!score);

  return studentScores.reduce((a, b) => a + b) / studentScores.length;
  //     ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
  // error: Object is possibly 'undefined'.
}

TypeScript did not infer a type predicate for score => !!score, and rightly so: if this returns true then score is a number. But if it returns false, then score could be either undefined or a number (specifically, 0). This is a real bug: if any student got a zero on the test, then filtering out their score will skew the average upwards. Fewer will be above average and more will be sad!

As with the first example, it’s better to explicitly filter out undefined values:

function getClassroomAverage(students: string[], allScores: Map<string, number>) {
  const studentScores = students
    .map(student => allScores.get(student))
    .filter(score => score !== undefined);

  return studentScores.reduce((a, b) => a + b) / studentScores.length;  // ok!
}

A truthiness check will infer a type predicate for object types, where there’s no ambiguity. Remember that functions must return a boolean to be a candidate for an inferred type predicate: x => !!x might infer a type predicate, but x => x definitely won’t.
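
For example, here's a sketch of that case using the Bird interface from the first example. Since an object is always truthy, a truthiness check over Bird | undefined has no ambiguity, and the predicate can be inferred:

declare const maybeBirds: (Bird | undefined)[];

// The callback is inferred as '(bird: Bird | undefined) => bird is Bird'
const realBirds = maybeBirds.filter(bird => !!bird);  // realBirds: Bird[]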

Explicit type predicates continue to work exactly as before. TypeScript will not check whether it would infer the same type predicate. Explicit type predicates ("is") are no safer than a type assertion ("as").

It’s possible that this feature will break existing code if TypeScript now infers a more precise type than you want. For example:

// Previously, nums: (number | null)[]
// Now, nums: number[]
const nums = [1, 2, 3, null, 5].filter(x => x !== null);

nums.push(null);  // ok in TS 5.4, error in TS 5.5

The fix is to tell TypeScript the type that you want using an explicit type annotation:

const nums: (number | null)[] = [1, 2, 3, null, 5].filter(x => x !== null);
nums.push(null);  // ok in all versions

For more information, check out the implementing pull request and Dan’s blog post about implementing this feature.

Control Flow Narrowing for Constant Indexed Accesses

TypeScript is now able to narrow expressions of the form obj[key] when both obj and key are effectively constant.

function f1(obj: Record<string, unknown>, key: string) {
    if (typeof obj[key] === "string") {
        // Now okay, previously was error
        obj[key].toUpperCase();
    }
}

In the above, neither obj nor key are ever mutated, so TypeScript can narrow the type of obj[key] to string after the typeof check. For more information, see the implementing pull request here.
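
Conversely (a sketch of our own, reading "effectively constant" strictly), assigning to key anywhere in the function should make this narrowing unavailable:

function f2(obj: Record<string, unknown>, key: string) {
    if (typeof obj[key] === "string") {
        key = "somethingElse";    // 'key' is mutated, so it is no longer constant
        obj[key].toUpperCase();   // error: 'obj[key]' is not narrowed to string
    }
}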

Type Imports in JSDoc

Today, if you want to import something only for type-checking in a JavaScript file, it is cumbersome. JavaScript developers can’t simply import a type named SomeType if it’s not there at runtime.

// ./some-module.d.ts
export interface SomeType {
    // ...
}

// ./index.js
import { SomeType } from "./some-module"; // ❌ runtime error!

/**
 * @param {SomeType} myValue
 */
function doSomething(myValue) {
    // ...
}

SomeType won’t exist at runtime, so the import will fail. Developers can use a namespace import instead.

import * as someModule from "./some-module";

/**
 * @param {someModule.SomeType} myValue
 */
function doSomething(myValue) {
    // ...
}

But ./some-module is still imported at runtime – which might also not be desirable.

To avoid this, developers typically had to use import(...) types in JSDoc comments.

/**
 * @param {import("./some-module").SomeType} myValue
 */
function doSomething(myValue) {
    // ...
}

If you wanted to reuse the same type in multiple places, you could use a typedef to avoid repeating the import.

/**
 * @typedef {import("./some-module").SomeType} SomeType
 */

/**
 * @param {SomeType} myValue
 */
function doSomething(myValue) {
    // ...
}

This helps with local uses of SomeType, but it gets repetitive for many imports and can be a bit verbose.

That’s why TypeScript now supports a new @import comment tag that has the same syntax as ECMAScript imports.

/** @import { SomeType } from "some-module" */

/**
 * @param {SomeType} myValue
 */
function doSomething(myValue) {
    // ...
}

Here, we used named imports. We could also have written our import as a namespace import.

/** @import * as someModule from "some-module" */

/**
 * @param {someModule.SomeType} myValue
 */
function doSomething(myValue) {
    // ...
}

Because these are just JSDoc comments, they don’t affect runtime behavior at all.

We would like to extend a big thanks to Oleksandr Tarasiuk who contributed this change!

Regular Expression Syntax Checking

Until now, TypeScript has typically skipped over most regular expressions in code. This is because regular expressions technically have an extensible grammar and TypeScript never made any effort to compile regular expressions to earlier versions of JavaScript. Still, this meant that lots of common problems would go undiscovered in regular expressions, and they would either turn into errors at runtime, or silently fail.

But TypeScript now does basic syntax checking on regular expressions!

let myRegex = /@robot(\s+(please|immediately)))? do some task/;
//                                            ~
// error!
// Unexpected ')'. Did you mean to escape it with backslash?

This is a simple example, but this checking can catch a lot of common mistakes. In fact, TypeScript’s checking goes slightly beyond syntactic checks. For instance, TypeScript can now catch issues around backreferences that don’t exist.

let myRegex = /@typedef \{import\((.+)\)\.([a-zA-Z_]+)\} \3/u;
//                                                        ~
// error!
// This backreference refers to a group that does not exist.
// There are only 2 capturing groups in this regular expression.

The same applies to named capturing groups.

let myRegex = /@typedef \{import\((?<importPath>.+)\)\.(?<importedEntity>[a-zA-Z_]+)\} \k<namedImport>/;
//                                                                                        ~~~~~~~~~~~
// error!
// There is no capturing group named 'namedImport' in this regular expression.

TypeScript’s checking is now also aware of when certain RegExp features are used when newer than your target version of ECMAScript. For example, if we use named capturing groups like the above in an ES5 target, we’ll get an error.

let myRegex = /@typedef \{import\((?<importPath>.+)\)\.(?<importedEntity>[a-zA-Z_]+)\} \k<importedEntity>/;
//                                  ~~~~~~~~~~~~         ~~~~~~~~~~~~~~~~
// error!
// Named capturing groups are only available when targeting 'ES2018' or later.

The same is true for certain regular expression flags as well.
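
For example (a sketch; the exact error wording is our assumption), the d flag is an ES2022 feature, so using it under an older target should now be reported:

let myRegex = /@robot do some task/d;
//                                 ~
// error when targeting below ES2022!
// This regular expression flag is only available when targeting 'es2022' or later.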

Note that TypeScript’s regular expression support is limited to regular expression literals. If you try calling new RegExp with a string literal, TypeScript will not check the provided string.

We would like to thank GitHub user graphemecluster who iterated a ton with us to get this feature into TypeScript.

Isolated Declarations

This section was co-authored by Rob Palmer who supported the design of isolated declarations.

Declaration files (a.k.a. .d.ts files) describe the shape of existing libraries and modules to TypeScript. This lightweight description includes the library’s type signatures and excludes implementation details such as the function bodies. They are published so that TypeScript can efficiently check your usage of a library without needing to analyse the library itself. Whilst it is possible to handwrite declaration files, if you are authoring typed code, it’s much safer and simpler to let TypeScript generate them automatically from source files using --declaration.

The TypeScript compiler and its APIs have always had the job of generating declaration files; however, there are some use-cases where you might want to use other tools, or where the traditional build process doesn’t scale.

Use-case: Faster Declaration Emit Tools

Imagine if you wanted to create a faster tool to generate declaration files, perhaps as part of a publishing service or a new bundler. Whilst there is a thriving ecosystem of blazing fast tools that can turn TypeScript into JavaScript, the same is not true for turning TypeScript into declaration files. The reason is that TypeScript’s inference allows us to write code without explicitly declaring types, meaning declaration emit can be complex.

Let’s consider a simple example of a function that adds two imported variables.

// util.ts
export let one = "1";
export let two = "2";

// add.ts
import { one, two } from "./util";
export function add() { return one + two; }

Even if the only thing we want to do is generate add.d.ts, TypeScript needs to crawl into another imported file (util.ts), infer that the type of one and two are strings, and then calculate that the + operator on two strings will lead to a string return type.

// add.d.ts
export declare function add(): string;

While this inference is important for the developer experience, it means that tools that want to generate declaration files would need to replicate parts of the type-checker including inference and the ability to resolve module specifiers to follow the imports.

Use-case: Parallel Declaration Emit and Parallel Checking

Imagine if you had a monorepo containing many projects and a multi-core CPU that just wished it could help you check your code faster. Wouldn’t it be great if we could check all those projects at the same time by running each project on a different core?

Unfortunately we don’t have the freedom to do all the work in parallel. The reason is that we have to build those projects in dependency order, because each project is checking against the declaration files of their dependencies. So we must build the dependency first to generate the declaration files. TypeScript’s project references feature works the same way, building the set of projects in "topological" dependency order.

As an example, if we have two projects called backend and frontend, and they both depend on a project called core, TypeScript can’t start type-checking either frontend or backend until core has been built and its declaration files have been generated.

[Diagram: frontend and backend both point to core; other projects might point to each of those.]

In the above graph, you can see that we have a bottleneck. Whilst we can build frontend and backend in parallel, we need to first wait for core to finish building before either can start.

How could we improve upon this? Well, if a fast tool could generate all those declaration files for core in parallel, TypeScript then could immediately follow that by type-checking core, frontend, and backend also in parallel.

Solution: Explicit Types!

The common requirement in both use-cases is that we need a cross-file type-checker to generate declaration files, which is a lot to ask from the tooling community.

As a more complex example, if we want a declaration file for the following code…

import { add } from "./add";

const x = add();

export function foo() {
    return x;
}

…we would need to generate a signature for foo. Well that requires looking at the implementation of foo. foo just returns x, so getting the type of x requires looking at the implementation of add. But that might require looking at the implementation of add’s dependencies, and so on. What we’re seeing here is that generating declaration files requires a whole lot of logic to figure out the types of different places that might not even be local to the current file.

Still, for developers looking for fast iteration time and fully parallel builds, there is another way of thinking about this problem. A declaration file only requires the types of the public API of a module – in other words, the types of the things that are exported. If, controversially, developers are willing to explicitly write out the types of the things they export, tools could generate declaration files without needing to look at the implementation of the module – and without reimplementing a full type-checker.

This is where the new --isolatedDeclarations option comes in. --isolatedDeclarations reports errors when a module can’t be reliably transformed without a type-checker. More plainly, it makes TypeScript report errors if you have a file that isn’t sufficiently annotated on its exports.

That means in the above example, we would see an error like the following:

export function foo() {
//              ~~~
// error! Function must have an explicit
// return type annotation with --isolatedDeclarations.
    return x;
}

Why are errors desirable?

Because it means that TypeScript can

  1. Tell us up-front whether other tools will have issues with generating declaration files
  2. Provide a quick fix to help add these missing annotations.

This mode doesn’t require annotations everywhere though. For locals, these can be ignored, since they don’t affect the public API. For example, the following code would not produce an error:

import { add } from "./add";

const x = add("1", "2"); // no error on 'x', it's not exported.

export function foo(): string {
    return x;
}

There are also certain expressions where the type is "trivial" to calculate.

// No error on 'x'.
// It's trivial to calculate the type is 'number'
export let x = 10;

// No error on 'y'.
// We can get the type from the return expression.
export function y() {
    return 20;
}

// No error on 'z'.
// The type assertion makes it clear what the type is.
export function z() {
    return Math.max(x, y()) as number;
}

Using isolatedDeclarations

isolatedDeclarations requires that either the declaration or composite flags are also set.
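
For example, a minimal pairing in a tsconfig.json might look like this (only these two flags are needed for the mode itself; the rest of your configuration is up to you):

{
    "compilerOptions": {
        "declaration": true,
        "isolatedDeclarations": true
    }
}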

Note that isolatedDeclarations does not change how TypeScript performs emit – just how it reports errors. Importantly, and similar to isolatedModules, enabling the feature in TypeScript won’t immediately bring about the potential benefits discussed here. So please be patient and look forward to future developments in this space. Keeping tool authors in mind, we should also recognize that today, not all of TypeScript’s declaration emit can be easily replicated by other tools wanting to use it as a guide. That’s something we’re actively working on improving.

We also feel it is worth calling out that isolatedDeclarations should be adopted on a case-by-case basis. There are some developer ergonomics that are lost when using isolatedDeclarations, and thus it may not be the right choice if your setup is not leveraging the two scenarios mentioned earlier. For others, the work on isolatedDeclarations has already uncovered many optimizations and opportunities to unlock different parallel build strategies. In the meantime, if you’re willing to make the trade-offs, we believe isolatedDeclarations can be a powerful tool to speed up your build process once external tooling becomes available.

Credit

Work on isolatedDeclarations has been a long-time collaborative effort between the TypeScript team and the infrastructure and tooling teams within Bloomberg and Google. Individuals like Hana Joo from Google who implemented the quick fix for isolated declaration errors (more on that soon), as well as Ashley Claymore, Jan Kühle, Lisa Velden, Rob Palmer, and Thomas Chetwin have been involved in discussion, specification, and implementation for many months. But we feel it is specifically worth calling out the tremendous amount of work provided by Titian Cernicova-Dragomir from Bloomberg. Titian has been instrumental in driving the implementation of isolatedDeclarations and has been a contributor to the TypeScript project for years prior.

While the feature involved many changes, you can see the core work for Isolated Declarations here.

The ${configDir} Template Variable for Configuration Files

It’s common in many codebases to reuse a shared tsconfig.json file that acts as a "base" for other configuration files. This is done by using the extends field in a tsconfig.json file.

{
    "extends": "../../tsconfig.base.json",
    "compilerOptions": {
        "outDir": "./dist"
    }
}

One of the issues with this is that all paths in the tsconfig.json file are relative to the location of the file itself. This means that if you have a shared tsconfig.base.json file that is used by multiple projects, relative paths often won’t be useful in the derived projects. For example, imagine the following tsconfig.base.json:

{
    "compilerOptions": {
        "typeRoots": [
            "./node_modules/@types",
            "./custom-types"
        ],
        "outDir": "dist"
    }
}

If the author’s intent was that every tsconfig.json that extends this file should

  1. output to a dist directory relative to the derived tsconfig.json, and
  2. have a custom-types directory relative to the derived tsconfig.json,

then this would not work. The typeRoots paths would be relative to the location of the shared tsconfig.base.json file, not the project that extends it. Each project that extends this shared file would need to declare its own outDir and typeRoots with identical contents. This could be frustrating and hard to keep in sync between projects, and while the example above is using typeRoots, this is a common problem for paths and other options.

To solve this, TypeScript 5.5 introduces a new template variable ${configDir}. When ${configDir} is written in certain path fields of a tsconfig.json or jsconfig.json file, this variable is substituted with the containing directory of the configuration file in a given compilation. This means that the above tsconfig.base.json could be rewritten as:

{
    "compilerOptions": {
        "typeRoots": [
            "${configDir}/node_modules/@types",
            "${configDir}/custom-types"
        ],
        "outDir": "${configDir}/dist"
    }
}

Now, when a project extends this file, the paths will be relative to the derived tsconfig.json, not the shared tsconfig.base.json file. This makes it easier to share configuration files across projects and ensures that the configuration files are more portable.

If you intend to make a tsconfig.json file extendable, consider if a ./ should instead be written with ${configDir}.

For more information, see the proposal issue and the implementing pull request.

Consulting package.json Dependencies for Declaration File Generation

Previously, TypeScript would often issue an error message like

The inferred type of "X" cannot be named without a reference to "Y". This is likely not portable. A type annotation is necessary.

This was often due to TypeScript’s declaration file generation finding itself in the contents of files that were never explicitly imported in a program. Generating an import to such a file could be risky if the path ended up being relative. Still, for codebases with explicit dependencies in the dependencies (or peerDependencies and optionalDependencies) of a package.json, generating such an import should be safe under certain resolution modes. So in TypeScript 5.5, we’re more lenient when that’s the case, and many occurrences of this error should disappear.
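
For illustration (a hypothetical package.json, not from this post): if the "Y" above resolves into a package you explicitly list like the one below, TypeScript 5.5 can now generate the import in the declaration file rather than reporting the error.

{
    "name": "my-lib",
    "dependencies": {
        "some-dep": "^1.0.0"
    }
}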

See this pull request for more details on the change.

Editor and Watch-Mode Reliability Improvements

TypeScript has either added some new functionality or fixed existing logic that makes --watch mode and TypeScript’s editor integration feel more reliable. That should hopefully translate to fewer TSServer/editor restarts.

Correctly Refresh Editor Errors in Configuration Files

TypeScript can generate errors for tsconfig.json files; however, those errors are actually generated from loading a project, and editors typically don’t directly request those errors for tsconfig.json files. While this sounds like a technical detail, it means that when all errors issued in a tsconfig.json are fixed, TypeScript doesn’t issue a fresh empty set of errors, and users are left with stale errors unless they reload their editor.

TypeScript 5.5 now intentionally issues an event to clear these out. See more here.

Better Handling for Deletes Followed by Immediate Writes

Instead of overwriting files, some tools will opt to delete them and then create new files from scratch. This is the case when running npm ci, for instance.

While this can be efficient for those tools, it can be problematic for TypeScript’s editor scenarios, where deleting a watched file might dispose of it and all of its transitive dependencies. Deleting and creating a file in quick succession could lead to TypeScript tearing down an entire project and then rebuilding it from scratch.

TypeScript 5.5 now has a more nuanced approach by keeping parts of a deleted project around until it picks up on a new creation event. This should make operations like npm ci work a lot better with TypeScript. See more information on the approach here.

When TypeScript fails to resolve a module, it will still need to watch for any failed lookup paths in case the module is added later. Previously this was not done for symlinked directories, which could cause reliability issues in monorepo-like scenarios when a build occurred in one project but was not witnessed in the other. This should be fixed in TypeScript 5.5, and means you won’t need to restart your editor as often.

See more information here.

Project References Contribute to Auto-Imports

Auto-imports no longer require at least one explicit import to dependent projects in a project reference setup. Instead, auto-import completions should just work across anything you’ve listed in the references field of your tsconfig.json.

See more on the implementing pull request.

Performance and Size Optimizations

Monomorphized Objects in Language Service and Public API

In TypeScript 5.0, we ensured that our Node and Symbol objects had a consistent set of properties with a consistent initialization order. Doing so helps reduce polymorphism in different operations, which allows runtimes to fetch properties more quickly.

By making this change, we witnessed impressive speed wins in the compiler; however, most of these changes were performed on internal allocators for our data structures. The language service, along with TypeScript’s public API, uses a different set of allocators for certain objects. This allowed the TypeScript compiler to be a bit leaner, as data used only for the language service would never be used in the compiler.

In TypeScript 5.5, the same monomorphization work has been done for the language service and public API. What this means is that your editor experience, and any build tools that use the TypeScript API, will get a decent amount faster. In fact, in our benchmarks, we’ve seen a 5-8% speedup in build times when using the public TypeScript API’s allocators, and language service operations getting 10-20% faster. While this does imply an increase in memory, we believe that tradeoff is worth it and hope to find ways to reduce that memory overhead. Things should feel a lot snappier now.

For more information, see the change here.

Monomorphized Control Flow Nodes

In TypeScript 5.5, nodes of the control flow graph have been monomorphized so that they always hold a consistent shape. By doing so, check times will often be reduced by about 1%.

See this change here.

Optimizations on Control Flow Graph

In many cases, control flow analysis will traverse nodes that don’t provide any new information. We observed that the absence of any early termination or effects in the antecedents (or "dominators") of certain nodes meant that those nodes could always be skipped over. As such, TypeScript now constructs its control flow graphs to take advantage of this by linking to an earlier node that does provide interesting information for control flow analysis. This yields a flatter control flow graph, which can be more efficient to traverse. This optimization has yielded modest gains, but with up to 2% reductions in build time on certain codebases.

You can read more here.

TypeScript Package Size Reduction

Further leveraging our transition to modules in 5.0, we’ve significantly reduced TypeScript’s overall package size by making tsserver.js and typingsInstaller.js import from a common API library instead of having each of them produce standalone bundles.

This reduces TypeScript’s size on disk from 30.2 MB to 20.4 MB, and reduces its packed size from 5.5 MB to 3.7 MB!

Node Reuse in Declaration Emit

As part of the work to enable isolatedDeclarations, we’ve substantially improved how often TypeScript can directly copy your input source code when producing declaration files.

For example, let’s say you wrote

export const strBool: string | boolean = "hello";
export const boolStr: boolean | string = "world";

Note that the union types are equivalent, but the order of the union is different. When emitting the declaration file, TypeScript has two equivalent output possibilities.

The first is to use a consistent canonical representation for each type:

export const strBool: string | boolean;
export const boolStr: string | boolean;

The second is to re-use the type annotations exactly as written:

export const strBool: string | boolean;
export const boolStr: boolean | string;

The second approach is generally preferable for a few reasons:

  • Many equivalent representations still encode some level of intent that is better to preserve in the declaration file
  • Producing a fresh representation of a type can be somewhat expensive, so avoiding it is better
  • User-written types are usually shorter than generated type representations

In 5.5, we’ve greatly improved the number of places where TypeScript can correctly identify that it’s safe and correct to print back types exactly as they were written in the input file. Many of these cases are invisible performance improvements – TypeScript would previously generate fresh sets of syntax nodes and serialize them into a string. Instead, TypeScript can now operate over the original syntax nodes directly, which is much cheaper and faster.

Easier API Consumption from ECMAScript Modules

Previously, if you were writing an ECMAScript module in Node.js, named imports were not available from the typescript package.

import { createSourceFile } from "typescript"; // ❌ error

import * as ts from "typescript";
ts.createSourceFile // ❌ error

ts.default.createSourceFile // ✅ works - but ugh!

This is because cjs-module-lexer did not recognize the pattern of TypeScript’s generated CommonJS code. This has been fixed, and users can now use named imports from the TypeScript npm package with ECMAScript modules in Node.js.

import { createSourceFile } from "typescript"; // ✅ works now!

import * as ts from "typescript";
ts.createSourceFile // ✅ works now!

For more information, see the change here.

The transpileDeclaration API

TypeScript’s API exposes a function called transpileModule. It’s intended to make it easy to compile a single file of TypeScript code. Because it doesn’t have access to an entire program, the caveat is that it may not produce the right output if the code violates any errors under the isolatedModules option.

In TypeScript 5.5, we’ve added a new similar API called transpileDeclaration. This API is similar to transpileModule, but it’s specifically designed to generate a single declaration file based on some input source text. Just like transpileModule, it doesn’t have access to a full program, and a similar caveat applies: it only generates an accurate declaration file if the input code is free of errors under the new isolatedDeclarations option.

If desired, this function can be used to parallelize declaration emit across all files under isolatedDeclarations mode. Note that while you might experience some of the performance overhead of transpileModule in transpileDeclaration, we’re working on ways to optimize this further.
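
As a rough usage sketch (our own; assuming the options shape mirrors transpileModule as described above):

import ts from "typescript";

const source = `export const greeting: string = "hello";`;

// Produce declaration text for a single file, with no access to a full program.
const result = ts.transpileDeclaration(source, {
    fileName: "greeting.ts",
    compilerOptions: {},
});

console.log(result.outputText);
// expected to resemble: export declare const greeting: string;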

For more information, see the implementation here.

Notable Behavioral Changes

This section highlights a set of noteworthy changes that should be acknowledged and understood as part of any upgrade. Sometimes it will highlight deprecations, removals, and new restrictions. It can also contain bug fixes that are functionally improvements, but which can also affect an existing build by introducing new errors.

Disabling Features Deprecated in TypeScript 5.0

TypeScript 5.0 deprecated the following options and behaviors:

  • charset
  • target: ES3
  • importsNotUsedAsValues
  • noImplicitUseStrict
  • noStrictGenericChecks
  • keyofStringsOnly
  • suppressExcessPropertyErrors
  • suppressImplicitAnyIndexErrors
  • out
  • preserveValueImports
  • prepend in project references
  • implicitly OS-specific newLine

To continue using the deprecated options above, developers using TypeScript 5.0 and other more recent versions have had to specify a new option called ignoreDeprecations with the value "5.0".
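
In TypeScript 5.0 through 5.4, that opt-in looked something like this (using one of the deprecated flags as an example):

{
    "compilerOptions": {
        "ignoreDeprecations": "5.0",
        "importsNotUsedAsValues": "preserve"
    }
}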

In TypeScript 5.5, these options no longer have any effect. To help with a smooth upgrade path, you may still specify them in your tsconfig, but these will be an error to specify in TypeScript 6.0. See also the Flag Deprecation Plan which outlines our deprecation strategy.

More information around these deprecation plans is available on GitHub, which contains suggestions in how to best adapt your codebase.

lib.d.ts Changes

Types generated for the DOM may have an impact on type-checking your codebase. For more information, see the DOM updates for TypeScript 5.5.

Respecting File Extensions and package.json in Other Module Modes

Before Node.js implemented support for ECMAScript modules in v12, there was never a good way for TypeScript to know whether .d.ts files it found in node_modules represented JavaScript files authored as CommonJS or ECMAScript modules. When the vast majority of npm was CommonJS-only, this didn’t cause many problems – if in doubt, TypeScript could just assume that everything behaved like CommonJS. Unfortunately, if that assumption was wrong it could allow unsafe imports:

// node_modules/dep/index.d.ts
export declare function doSomething(): void;

// index.ts
// Okay if "dep" is a CommonJS module, but fails if
// it's an ECMAScript module - even in bundlers!
import dep from "dep";
dep.doSomething();

In practice, this didn’t come up very often. But in the years since Node.js started supporting ECMAScript modules, the share of ESM on npm has grown. Fortunately, Node.js also introduced a mechanism that can help TypeScript determine if a file is an ECMAScript module or a CommonJS module: the .mjs and .cjs file extensions and the package.json "type" field. TypeScript 4.7 added support for understanding these indicators, as well as authoring .mts and .cts files; however, TypeScript would only read those indicators under --module node16 and --module nodenext, so the unsafe import above was still a problem for anyone using --module esnext and --moduleResolution bundler, for example.

To solve this, TypeScript 5.5 reads and stores module format information encoded by file extensions and package.json "type" in all module modes, and uses it to resolve ambiguities like the one in the example above in all modes (except for amd, umd, and system).

A secondary effect of respecting this format information is that the format-specific TypeScript file extensions (.mts and .cts) or an explicitly set package.json "type" in your own project will override your --module option if it’s set to commonjs or es2015 through esnext. Previously, it was technically possible to produce CommonJS output into a .mjs file or vice versa:

// main.mts
export default "oops";

// $ tsc --module commonjs main.mts
// main.mjs
Object.defineProperty(exports, "__esModule", { value: true });
exports.default = "oops";

Now, .mts files (or .ts files in scope of a package.json with "type": "module") never emit CommonJS output, and .cts files (or .ts files in scope of a package.json with "type": "commonjs") never emit ESM output.

More details are available on the change here.

Stricter Parsing for Decorators

Since TypeScript originally introduced support for decorators, the specified grammar for the proposal has been tightened up. TypeScript is now stricter about what forms it allows. While rare, existing decorators may need to be parenthesized to avoid errors.

class DecoratorProvider {
    decorate(...args: any[]) { }
}

class D extends DecoratorProvider {
    m() {
        class C {
            @super.decorate // ❌ error
            method1() { }

            @(super.decorate) // ✅ okay
            method2() { }
        }
    }
}

See more information on the change here.

undefined is No Longer a Definable Type Name

TypeScript has always disallowed type alias names that conflict with built-in types:

// Illegal
type null = any;
// Illegal
type number = any;
// Illegal
type object = any;
// Illegal
type any = any;

Due to a bug, this logic didn’t also apply to the built-in type undefined. In 5.5, this is now correctly identified as an error:

// Now also illegal
type undefined = any;

Bare references to type aliases named undefined never actually worked in the first place. You could define them, but you couldn’t use them as an unqualified type name.

export type undefined = string;
export const m: undefined = "";
//           ^
// Errors in 5.4 and earlier - the local definition of 'undefined' was not even consulted.

For more information, see the change here.

What’s Next?

At this point, TypeScript 5.5 is what we’d call "feature-stable". The focus on TypeScript 5.5 will be bug fixes, polish, and certain low-risk editor features. We’ll have a release candidate available in a bit over a month, followed by a stable release soon after. If you’re interested in planning around the release, be sure to keep an eye on our iteration plan which has target release dates and more.

As a note: while beta is a great way to try out the next version of TypeScript, you can also try a nightly build to get the most up-to-date version of TypeScript 5.5 up until our release candidate. Our nightlies are well-tested and can even be tested solely in your editor.

So please try out the beta or a nightly release today and let us know what you think!

Happy Hacking!

– Daniel Rosenwasser and the TypeScript Team



Most Tech Jobs Are Jokes And I Am Not Laughing


Next.js App Router Routing patterns you should know


Defining a Route

The simplest pattern is just to create a directory inside the app folder with the route name, and in that directory create a page.tsx file.

nextjs-routing-patterns
├── apps
│   └── blog
│       └── src
│           └── app
│               └── posts
│                   └── page.tsx
└── libs

Here is our code to fetch posts and display them:

import { ContentWrapper, Title } from '@nrp/components/server';
import Link from 'next/link';

export default async function Page() {
  const posts = await fetch('https://jsonplaceholder.typicode.com/posts').then(
    (res) => res.json(),
  );

  return (
    <ContentWrapper>
      <Title>Posts</Title>

      <ul className="flex flex-col gap-2">
        {posts.map((post: { id: string; title: string }) => (
          <li key={post.id}>
            <Link
              href={`/posts/${post.id}`}
              className="capitalize hover:underline"
            >
              {post.title}
            </Link>
          </li>
        ))}
      </ul>
    </ContentWrapper>
  );
}

Dynamic Routes

We want to navigate to a post by its id; for this, we will need to create a dynamic route. Just create a directory whose name is the param wrapped in square brackets, with a page.tsx file inside that directory, as follows:

nextjs-routing-patterns
├── apps
│   └── blog
│       └── src
│           └── app
│               └── posts
│                   ├── [postId]
│                   │   └── page.tsx
│                   └── page.tsx
└── libs

Here is the code for our post:

import { ContentWrapper, Paragraph, Title } from '@nrp/components/server';

export default async function Page({ params }: { params: { postId: string } }) {
  // Artificial delay, presumably to make loading states visible
  await new Promise((resolve) => setTimeout(resolve, 1000));
  const post = await fetch(
    `https://jsonplaceholder.typicode.com/posts/${params.postId}`,
  ).then((res) => res.json());

  return (
    <ContentWrapper>
      <Title className="capitalize">{post.title}</Title>

      <Paragraph className="capitalize">{post.body}</Paragraph>
    </ContentWrapper>
  );
}

Catch All and Optional Catch All Routes

To catch all routes from a directory, except for the root of that directory’s route, we can use the Catch All pattern. We will add a directory with the [...slug] bracket-and-three-dots notation (slug will be our route param in the params prop), and we’ll add our page.tsx file to that directory:

nextjs-routing-patterns
├── apps
│   └── blog
│       └── src
│           └── app
│               └── catch-all
│                   └── [...slug]
│                       └── page.tsx
└── libs

Here is the code:

import { ContentWrapper, Title } from '@nrp/components/server';

export default function Page({ params }: { params: { slug: string[] } }) {
  return (
    <ContentWrapper>
      <Title>From Catch All</Title>
      <pre>{JSON.stringify(params.slug, null, 2)}</pre>
    </ContentWrapper>
  );
}

When we navigate to /catch-all we’ll get a 404 page; however, at /catch-all/next/page/etc the page renders as expected, with the array of params in the JSON.

The second pattern also lets us catch the directory’s root. So we’ll create a directory with the [[...slug]] double-brackets-and-three-dots notation, adding page.tsx to that folder:

nextjs-routing-patterns
├── apps
│   └── blog
│       └── src
│           └── app
│               └── optional-catch-all
│                   └── [[...slug]]
│                       └── page.tsx
└── libs

And the page.tsx code:

import { ContentWrapper, Title } from '@nrp/components/server';

export default function Page({ params }: { params: { slug: string[] } }) {
  return (
    <ContentWrapper>
      <Title>From Optional Catch All</Title>

      <pre>{JSON.stringify(params.slug, null, 2)}</pre>
    </ContentWrapper>
  );
}

Now the root directory will not return a 404 page but the page’s title and an empty params array. Navigating further will result in the same behavior as previously illustrated.

Nested Layout

We can nest our routing layouts by adding a layout.tsx file to our new route. This will nest the current layout inside the parent layout file and display the current page inside the new layout’s children. It is a great pattern for creating tabbed navigation, for example.

nextjs-routing-patterns
├── apps
│   └── blog
│       └── src
│           └── app
│               └── nested
│                   ├── password
│                   │   └── page.tsx
│                   ├── layout.tsx
│                   ├── page.tsx
│                   └── tabs.tsx
└── libs

Here is the layout.tsx code:

import { ContentWrapper, Title } from '@nrp/components/server';
import { NavigationTabs } from '@nrp/components';

export default function Layout({ children }: { children: React.ReactNode }) {
  return (
    <ContentWrapper>
      <Title>I am nested layout</Title>
      <NavigationTabs
        items={[
          { title: 'Account', url: '/nested' },
          { title: 'Password', url: '/nested/password' },
        ]}
      />

      {children}
    </ContentWrapper>
  );
}

Now our page at the nested route will be rendered as the layout’s children, and so will the password route. What is great about this is that the layout does not re-render on navigation, which speeds up our SPA.

If you need to re-render the layout page, consider using the template file convention: https://nextjs.org/docs/app/api-reference/file-conventions/template
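
A minimal sketch of that convention (our own; unlike a layout, a template is re-mounted on every navigation, so state resets and effects re-run):

// app/nested/template.tsx
export default function Template({ children }: { children: React.ReactNode }) {
  return <div>{children}</div>;
}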

Here is the page.tsx code:

import { ContentWrapper, Paragraph, Title } from '@nrp/components/server';

export default function Page() {
  return (
    <ContentWrapper>
      <Title>Account</Title>

      <Paragraph>
        Lorem ipsum dolor sit amet, consectetur adipisicing elit. A architecto,
        corporis eos eum facilis incidunt libero perspiciatis provident quae
        quod. Aliquid animi at culpa, hic illo reiciendis similique? Molestiae,
        repudiandae.
      </Paragraph>
    </ContentWrapper>
  );
}

And the password’s page.tsx code:

import { ContentWrapper, Paragraph, Title } from '@nrp/components/server';

export default function Page() {
  return (
    <ContentWrapper>
      <Title>Password</Title>

      <Paragraph>
        Lorem ipsum dolor sit amet, consectetur adipiscing elit. sed do eiusmod
      </Paragraph>
    </ContentWrapper>
  );
}

Parallel Routes

This pattern is great for displaying two or more different pages side by side, and even for creating unique navigation for each page inside the joined parent page.

To do so we need to create a slot first. A slot is a directory named with the @ sign followed by the slot's name, e.g. @albums. Inside we’ll again create a page.tsx file. Let’s do the same for a @users slot.

nextjs-routing-patterns
├── apps
│   └── blog
│       └── src
│           └── app
│               └── parallel
│                   ├── @albums
│                   │   └── page.tsx
│                   └── @users
│                       └── page.tsx
└── libs

Here is the code for the albums page:

import { ContentWrapper, Title } from '@nrp/components/server';

export default async function Page() {
  const albums = await fetch(
    `https://jsonplaceholder.typicode.com/albums`,
  ).then((res) => res.json());

  return (
    <ContentWrapper>
      <Title size="small">Albums</Title>

      <ul>
        {albums.map((album: { id: string; title: string }) => (
          <li key={album.id}>{album.title}</li>
        ))}
      </ul>
    </ContentWrapper>
  );
}

And for the users page:

import { ContentWrapper, Title } from '@nrp/components/server';
import { Avatar, AvatarFallback, AvatarImage } from '@nrp/components';
import Link from 'next/link';

export default async function Page() {
  const users = await fetch('https://jsonplaceholder.typicode.com/users').then(
    (res) => res.json(),
  );

  return (
    <ContentWrapper>
      <Title size="small">Users</Title>

      <ul className="flex flex-col gap-4">
        {users.map(
          (user: {
            id: string;
            name: string;
            username: string;
            email: string;
          }) => (
            <li key={user.id} className="flex items-center gap-4">
              <Avatar>
                <AvatarImage
                  className="bg-foreground"
                  src={`https://robohash.org/${user.username}`}
                  alt="@shadcn"
                />
              </Avatar>
              <div>
                <p className="text-sm font-medium leading-none">{user.name}</p>
                <p className="text-sm text-muted-foreground">{user.email}</p>
              </div>
            </li>
          ),
        )}
      </ul>
    </ContentWrapper>
  );
}

Next we need to add the slots to our layout. We can add a nested layout inside the parallel route, or add the slots to the root layout instead. We create a layout.tsx file and a page.tsx for our parallel page.

nextjs-routing-patterns
├── apps
│   └── blog
│       └── src
│           └── app
│               └── parallel
│                   ├── @albums
│                   │   └── page.tsx
│                   ├── @users
│                   │   └── page.tsx
│                   ├── layout.tsx
│                   └── page.tsx
└── libs

Here is the code for the layout:

import { Title } from '@nrp/components/server';

export default function Layout({
  children,
  users,
  albums,
}: {
  children: React.ReactNode;
  users: React.ReactNode;
  albums: React.ReactNode;
}) {
  return (
    <div>
      <Title>Parallel Layout</Title>

      {children}

      <div className="flex gap-4 p-4 justify-around">
        {users}
        {albums}
      </div>
    </div>
  );
}

And our parallel route page:

import { Title } from '@nrp/components/server';

export default function Page() {
  return <Title size="medium">Parallel Routes</Title>;
}

Intercepting Routes

Sometimes we want to soft-navigate to a page just to peek at it, say in a Modal, while keeping the original route intact (for direct access, a full refresh, or sharing links). This pattern is great for exactly that.

Given a photo gallery route and a photo-by-id route, we want to open the photo in a Modal dialog on the client, but load the full photo page on a hard reload or when the link is accessed directly.

This will be the directory structure for our photo gallery:

nextjs-routing-patterns
├── apps
│   └── blog
│       └── src
│           └── app
│               └── intercepted
│                   ├── [photoId]
│                   │   └── page.tsx
│                   └── page.tsx
└── libs

Now we want to intercept the /intercepted/[photoId] route. To do so we need to create a slot directory and add it to the layout, and inside the slot directory create a directory with a (.) prefix that represents the intercepted route: (.) matches segments on the same level, (..) one level above, (..)(..) two levels above, and (...) matches from the root app directory.

If it is a nested route, we add its routing directory structure inside this directory without the prefix; only the first folder carries it. Let’s also add a layout.tsx file to hold our @modal slot.

nextjs-routing-patterns
├── apps
│   └── blog
│       └── src
│           └── app
│               └── intercepted
│                   ├── @modal
│                   │   └── (..)intercepted
│                   │       └── [photoId]
│                   │           └── page.tsx
│                   ├── [photoId]
│                   │   └── page.tsx
│                   ├── layout.tsx
│                   └── page.tsx
└── libs

This is the layout.tsx file code:

import { ReactNode } from 'react';

export default function Layout({
  children,
  modal,
}: {
  children: ReactNode;
  modal: ReactNode;
}) {
  return (
    <>
      {children} {modal}
    </>
  );
}

The /[photoId]/page.tsx page code:

import { ContentWrapper, Title } from '@nrp/components/server';
import { Photo } from '../../components/photo';

export default async function Page({
  params,
}: {
  params: { photoId: string };
}) {
  return (
    <ContentWrapper>
      <Title>Intercepted Route</Title>

      <div className="w-[600px] self-center">
        <Photo imageId={params.photoId} />
      </div>
    </ContentWrapper>
  );
}
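Note that the param key must match the [photoId] segment name. The Photo component itself is not part of this article’s listings; a minimal sketch of what it could look like, assuming it is a server component fetching from the same jsonplaceholder API used above (the url and title fields are assumptions based on that API):

// Hypothetical sketch of components/photo.tsx: a server component that
// loads a single photo by id and renders it.
export async function Photo({ imageId }: { imageId: string }) {
  const photo = await fetch(
    `https://jsonplaceholder.typicode.com/photos/${imageId}`,
  ).then((res) => res.json());

  return <img src={photo.url} alt={photo.title} />;
}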

The intercepted /@modal/(..)intercepted/[photoId]/page.tsx page code:

import { Photo } from '../../../../components/photo';
import { Suspense } from 'react';
import { Loader2 } from 'lucide-react';
import { Modal } from '../../../../components/modal';

export default async function Page({
  params,
}: {
  params: { photoId: string };
}) {
  return (
    <Modal title="Intercepted Route">
      <div className="min-h-[100px] flex items-center justify-center">
        <Suspense fallback={<Loader2 className="animate-spin" />}>
          <Photo imageId={params.photoId} />
        </Suspense>
      </div>
    </Modal>
  );
}
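The Modal component is not shown in the article either. A minimal sketch, assuming it is a client component that closes by navigating back, which unwinds the soft navigation and reveals the page underneath:

'use client';

// Hypothetical sketch of components/modal.tsx: renders its children in an
// overlay and calls router.back() to close, undoing the soft navigation.
import { useRouter } from 'next/navigation';
import { ReactNode } from 'react';

export function Modal({ title, children }: { title?: string; children: ReactNode }) {
  const router = useRouter();

  return (
    <div className="fixed inset-0 flex items-center justify-center bg-black/50">
      <div className="bg-background p-4 rounded">
        {title && <h2>{title}</h2>}
        {children}
        <button onClick={() => router.back()}>Close</button>
      </div>
    </div>
  );
}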

Now, to make it work as expected, we also need to add default.tsx files to our directory structure to tell Next.js what to render in the layout’s slot when nothing is intercepted:

nextjs-routing-patterns
├── apps
│   └── blog
│       └── src
│           └── app
│               └── intercepted
│                   ├── @modal
│                   │   └── (..)intercepted
│                   │       ├── [photoId]
│                   │       │   └── page.tsx
│                   │       └── default.tsx
│                   ├── [photoId]
│                   │   ├── default.tsx
│                   │   └── page.tsx
│                   ├── default.tsx
│                   ├── layout.tsx
│                   └── page.tsx
└── libs
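The default.tsx files themselves can be as simple as returning null, which tells Next.js to render nothing for that slot; a minimal sketch:

// Minimal default.tsx sketch: render nothing in the slot
// when no route has been intercepted.
export default function Default() {
  return null;
}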

Dynamic Render

Another pattern I like to use is an optional catch-all route that conditionally renders the page depending on whether params are present or not. This gives me a way to handle multiple scenarios in a single page and to always render the pages the way I want, even when they are shared via a direct link, for example displaying a photo image in a Modal.

Here is the directory structure:

nextjs-routing-patterns
├── apps
│   └── blog
│       └── src
│           └── app
│               └── dynamic-render
│                   └── [[...slug]]
│                       └── page.tsx
└── libs

And the page.tsx code:

import { Photo } from '../../components/photo';
import { Suspense } from 'react';
import { Modal } from '../../components/modal';
import { Loader2 } from 'lucide-react';
import { Photos } from '../../components/photos';
import { ContentWrapper } from '@nrp/components/server';

export default async function Page({ params }: { params: { slug?: string[] } }) {
  const [photoId] = params.slug ?? [];

  if (!photoId)
    return (
      <Suspense fallback={'Loading...'}>
        <Photos title="Dynamic Render" page="dynamic-render" />
      </Suspense>
    );

  return (
    <ContentWrapper>
      <Photos title="Dynamic Render" page="dynamic-render" />

      {photoId && (
        <Modal>
          <div className="min-h-[100px] flex items-center justify-center">
            <Suspense fallback={<Loader2 className="animate-spin" />}>
              <Photo imageId={photoId} />
            </Suspense>
          </div>
        </Modal>
      )}
    </ContentWrapper>
  );
}
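The Photos list component is also not shown in the article. A hypothetical sketch, assuming it fetches a photo list from jsonplaceholder and links each item to /<page>/<photoId>, so that clicking a photo re-renders the catch-all page above with a photoId param:

// Hypothetical sketch of components/photos.tsx: lists photos and links each
// one into the dynamic segment of the given page.
import Link from 'next/link';

export async function Photos({ title, page }: { title: string; page: string }) {
  const photos = await fetch(
    'https://jsonplaceholder.typicode.com/photos?_limit=12',
  ).then((res) => res.json());

  return (
    <div>
      <h2>{title}</h2>
      <ul>
        {photos.map((photo: { id: number; title: string }) => (
          <li key={photo.id}>
            <Link href={`/${page}/${photo.id}`}>{photo.title}</Link>
          </li>
        ))}
      </ul>
    </div>
  );
}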

Conclusion

So we have seen how we can use the Next.js App directory’s routing patterns to take different approaches to our navigation system.

I hope you found this article useful. Let me know in the comments below what you think, and share your suggestions and love 😍.


Gap is the new Margin

1 Share

In 2020, Max Stoiber wrote the 🌶️ spicy Margin considered harmful. On one hand, it seems silly. The margin property of CSS is just a way to push other elements away. It’s very common and doesn’t feel particularly problematic. On the other hand… maybe it is? At least at the design system component level, because those components don’t know the context in which they will be used. Max wrote:

Margin breaks component encapsulation. A well-built component should not affect anything outside itself.
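To make the encapsulation point concrete, here is a hypothetical TSX sketch (not from either article): the margin version leaks a layout decision into every context it is dropped into, while the gap version lets the parent own the spacing:

import { ReactNode } from 'react';

// A card that sets its own outer margin pushes siblings away
// no matter where it is used.
const MarginCard = ({ children }: { children: ReactNode }) => (
  <div style={{ marginBottom: '1rem' }}>{children}</div>
);

// An encapsulated card plus a parent that owns the spacing via gap.
const Card = ({ children }: { children: ReactNode }) => <div>{children}</div>;

const Stack = ({ children }: { children: ReactNode }) => (
  <div style={{ display: 'flex', flexDirection: 'column', gap: '1rem' }}>
    {children}
  </div>
);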

Adam Argyle, writing slightly earlier, predicted that the usage of margin would naturally decline:

Prediction: margins in stylesheets will decline as gap in stylesheets climb

Well, it’s four years later now! Has any of this played out? It’s super hard to know. Anecdotally, it feels like gap is much more heavily used, and my own usage is certainly up. There is public data on the usage of CSS features, and, amazingly, margin usage does appear to be slowly going down.

It looks like a slow but sure decline over the last 18 months or so.

I say “amazingly” because the way this data is collected checks if the site uses the feature at all, not how much it’s used.

The chart below shows the percentage of page loads (in Chrome) that use this feature at least once.

So seeing a dip here means fewer sites are using the margin property at all.


The Front End Developer/Engineer Handbook 2024 — A Guide to Modern Web Development

1 Share

We just released the highly anticipated Frontend Handbook 2024, by Cody Lindley!

The handbook provides an in-depth overview of the skills, tools, and technologies necessary to excel as a front-end developer / engineer. With 38,000 words of practical knowledge and advice, it covers the core technologies—HTML, CSS, and JavaScript—and how they form the foundation of modern front-end development.

As Cody Lindley reflects on the current state of front-end development:

“Once upon a time, front-end development primarily focused on the user and the user interface, with programming/software playing a secondary role. […] We have to find our way back to the user, back to the user interface.”

Get an overview of popular tools, libraries, and frameworks that go beyond the front end, such as:

  • React.js/Next.js
  • Svelte/SvelteKit
  • Vue.js/Nuxt
  • SolidJS/SolidStart
  • Angular
  • Astro
  • Qwik
  • Lit

These tools enable you to create full-stack web apps and websites that interact with databases and share templates across server and client.

You can also develop native applications using web technologies with frameworks like:

  • Electron for desktop apps
  • React Native and Capacitor for mobile apps
  • Tauri for mobile and desktop operating systems

Additionally, we touch on Progressive Web Apps (PWAs) and their ability to create installable applications with native-like experiences from a single codebase.

Whether you’re a seasoned front-end developer looking to refresh your understanding of the industry or a beginner eager to embark on a career in this exciting field, the Frontend Handbook 2024 is an essential resource.

To access the Frontend Handbook 2024, read it for free here:
