Written by: Chia1104 CC BY-NC-SA 4.0
In 2024, I spent nearly a year refactoring my personal website. Since I first built it with Next.js and TypeScript in early 2021, this project has been with me for three years. At that time, I was unfamiliar with React and using TypeScript in a project for the first time, so I decided to use this project to practice the framework, language, and cultivate a writing habit. As my technical skills grew, I began refactoring and transforming the original codebase, resulting in the website you see today.
In this refactoring, I set several clear objectives:
Initially, this website used static MDX to generate articles (next-mdx-remote). Later, I wanted to try updating articles through Next.js's ISR (Incremental Static Regeneration). At that time, Next.js 12's features had stabilized, so I came up with the idea of creating a backend to manage articles. Meanwhile, Turborepo had just been acquired by Vercel, and that's when I became familiar with the monorepo architecture.
It's been over two years since I started developing with the monorepo architecture, from fumbling through on my own to gradually mastering more complex project development—it's been a continuous learning process.
In the past, when working on side projects with colleagues, every time we started a new project, we had to copy methods or components from previous projects into the current one, wasting unnecessary time in the initial stages. This was another reason I wanted to try monorepo.
Of course, I encountered quite a few issues along the way, including adjustments to the deployment process and optimizations for local development. For example, I only discovered after implementation that caching could be used to speed up lint and type checks.
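That task-level caching lives in `turbo.json`. A minimal sketch, assuming Turborepo 2.x (1.x uses a `pipeline` key instead of `tasks`) and hypothetical `lint` / `typecheck` task names:

```json
{
  "$schema": "https://turbo.build/schema.json",
  "tasks": {
    "build": { "dependsOn": ["^build"], "outputs": [".next/**", "dist/**"] },
    "lint": {},
    "typecheck": { "dependsOn": ["^build"] }
  }
}
```

With this in place, re-running `turbo lint typecheck` on unchanged packages replays cached results instead of re-executing the tools.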
Initially, this project was planned around the T3 architecture, with tRPC as one of the core tools. tRPC handles frontend-backend API integration and shares the type definitions of inputs and outputs between the two sides.
In the past, when developing APIs with Next.js API routes, you had to define the types and validation for API inputs and outputs separately on the frontend and the backend. Whenever the backend business logic changed, you also had to check whether the frontend and its types needed corresponding updates, which was time-consuming.
```ts
import type { NextApiRequest, NextApiResponse } from "next";
import { getSession } from "next-auth/react";

import { deleteFeed, getFeed } from "@/server/feed/services";

export default async function handler(
  req: NextApiRequest,
  res: NextApiResponse
) {
  const session = await getSession({ req });
  if (!session) {
    return res.status(401).json({ message: "Unauthorized" });
  }
  switch (req.method) {
    // note: req.method is uppercase
    case "DELETE":
      try {
        await deleteFeed(req.query.feedId as string);
        return res.status(204).end();
      } catch (error) {
        console.error(error);
        return res.status(400).json({ message: "Bad Request" });
      }
    case "GET":
      try {
        const result = await getFeed(req.query.feedId as string);
        return res.status(200).json(result);
      } catch (error) {
        console.error(error);
        return res.status(400).json({ message: "Bad Request" });
      }
    default:
      return res.status(405).json({ message: "Method Not Allowed" });
  }
}
```

```ts
const getFeedById = async (
  feedId: string
): Promise<
  ApiResponse<
    {
      id: string;
      name: string;
      // ...
    }[]
  >
> => {
  try {
    const res = await fetch(`/api/sign/${feedId}`, {
      method: "GET",
      credentials: "include",
    });
    if (!res.ok) {
      // handle error
    }
    return await res.json();
  } catch (error) {
    // handle 500 error, then rethrow so the return type stays honest
    throw error;
  }
};
```

The introduction of tRPC solved this problem perfectly. I only need to spend time maintaining the backend service; the frontend infers types directly from it, and tRPC handles errors as well.
```ts
const feedsRouter = createTRPCRouter({
  getFeedsWithMeta: protectedProcedure
    .input(z.object({ /** DTO Schema **/ }))
    .query((opts) => {
      return getFeedsWithMeta(opts.ctx.db, opts.input);
    }),
});

export const appRouter = createTRPCRouter({
  feeds: feedsRouter,
});

// export type definition of API
export type AppRouter = typeof appRouter;
```

```ts
"use client";

import { createTRPCReact } from "@trpc/react-query";

import type { AppRouter } from "@/server/trpc";

export const api = createTRPCReact<AppRouter>({
  abortOnUnmount: true,
});
```

```tsx
"use client";

const FeedList: FC<Props> = (props) => {
  const { initFeed, nextCursor, query = {} } = props;
  const {
    data,
    isSuccess,
    fetchNextPage,
    hasNextPage,
    isFetchingNextPage,
    isLoading,
  } = api.feeds.getFeedsWithMeta.useInfiniteQuery(query, {
    getNextPageParam: (lastPage) => lastPage?.nextCursor,
  });
  // ...
};
```

tRPC is a relatively new full-stack tool that takes some time to get familiar with during initial setup. Once configured, however, it significantly reduces API integration time in subsequent development. tRPC is particularly suitable for:
Hono is a lightweight JavaScript framework that can run on multiple runtimes (Node, Bun, Deno), with syntax similar to Express.
```ts
import { Hono } from "hono";

const app = new Hono();
app.get("/", (c) => c.text("Hono!"));

export default app;
```

It also supports third-party middleware for tools like tRPC and Auth.js, which I was already using at the time, making integration convenient. So I later chose Hono to develop some of my APIs.
Initially, the API I developed used Nest.js, which was the first Node.js server framework I learned. Nest.js's strengths lie in project structure and management, and it is developed mostly in TypeScript. Its structure helped me quickly understand the architectural patterns a backend should have; personally, I think Nest.js is a good choice for backend beginners.
However, Nest.js had a major issue in the version available at the time: it primarily supported the CommonJS module format. By 2024, many mainstream packages shipped only as ES Modules, which hurt developer experience (DX) considerably. To use an ES Module package from CommonJS, you have to load it asynchronously, and any method that does so must itself be async.
```ts
@Injectable()
export class AuthGuard implements CanActivate {
  constructor(@Inject(DRIZZLE_PROVIDER) private readonly db: DB) {}

  async canActivate(context: ExecutionContext): Promise<boolean> {
    // ES Module packages must be loaded via dynamic import() from CommonJS
    const createActionURL = (await import("@auth/core")).createActionURL;
    // ...
  }
}
```

Moreover, Nest.js only supported the Node runtime at the time, so I couldn't even use Bun to sidestep the CommonJS/ES Module problem. I eventually switched decisively to Hono running on Bun, avoiding the compatibility issues while preserving developer experience.
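Serving Hono on Bun requires no adapter, since Bun's server accepts a default export with a `fetch` handler. A sketch (the port and route are illustrative):

```typescript
import { Hono } from "hono";

const app = new Hono();
app.get("/health", (c) => c.json({ ok: true }));

// `bun run src/index.ts` picks up this default export and starts
// listening, with no Node-specific server wrapper needed
export default {
  port: 3000,
  fetch: app.fetch,
};
```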
Better Auth is an authentication library designed specifically for TypeScript projects, with the following main capabilities:
```ts
import { betterAuth } from "better-auth";
import { drizzleAdapter } from "better-auth/adapters/drizzle";

import { db } from "@/db"; // your drizzle instance

export const auth = betterAuth({
  database: drizzleAdapter(db, {
    provider: "pg", // or "mysql", "sqlite"
  }),
  socialProviders: {
    github: {
      clientId: process.env.GITHUB_CLIENT_ID,
      clientSecret: process.env.GITHUB_CLIENT_SECRET,
    },
  },
});
```

Better Auth was my final choice for authentication. The main reasons include:
Sessions can be stored through a secondary storage (for example, an in-memory store such as Redis) alongside the primary database:

```ts
betterAuth({
  // ... other options
  secondaryStorage: {
    // Your implementation here
  },
});
```

Initially, I used Auth.js. In Next.js projects this tool is actually very convenient: you only need to adjust the config, and ready-made APIs are available on both the frontend and backend.
```ts
// libs/auth.ts
import NextAuth from "next-auth";
import GitHub from "next-auth/providers/github";

export const { auth, handlers } = NextAuth({ providers: [GitHub] });
```

```ts
// middleware.ts
export { auth as middleware } from "@/libs/auth";
```

```ts
// app/api/auth/[...nextauth]/route.ts
import { handlers } from "@/libs/auth";

export const { GET, POST } = handlers;
```

However, Auth.js implements many things internally, so customization takes comparatively more time. Its storage layer accepts only a single adapter; to share sessions between Redis and PostgreSQL, you would need to write your own adapter. It also doesn't offer a truly headless approach: when embedded in your own backend, its actions respond only with a `Response` object, making it hard to integrate with other services that expect JSON or plain objects.
```ts
const session = await auth.api.getSession({ headers: c.req.raw.headers });
//    ^^^ Handle authentication
```

You can use getSession to retrieve the session, but internally its implementation is the same as Auth.js - Express:
```ts
import { createActionURL } from "@auth/core";

const url = createActionURL(
  "session",
  // ^^^ action name (getSession)
  request.protocol,
  new Headers(request.headers as HeadersInit),
  process.env,
  "/auth"
);
const response = await Auth(
  new Request(url, { headers: { cookie: request.headers.cookie ?? "" } }),
  {
    secret: env.AUTH_SECRET,
  }
);
const session = (await response.json()) as Session | null;
//    ^^^ Handle authentication
```

| Feature | Auth.js | Better Auth |
|---|---|---|
| Framework Support | Primarily optimized for Next.js | Supports multiple frameworks (React, Vue, Svelte, etc.) |
| Customization Difficulty | Higher (requires understanding internal implementation) | Lower (provides headless API) |
| Storage Solutions | Single adapter | Supports dual storage solutions (e.g., Redis + PostgreSQL) |
| API Response Format | Response object | JSON object |
| Built-in Features | OAuth, Email authentication | OAuth, Email, MFA, Passkey, OIDC |
| Ecosystem Maturity | Mature and stable | Newer but feature-complete |
| Use Cases | Next.js projects, requiring quick integration | Full-stack projects requiring high customization |
In the past, I also tried several SaaS solutions, but eventually decided that since I already had my own database storing other data, I might as well implement authentication myself. Still, I think these solutions are excellent choices for quickly building MVP applications.

Clerk provides many login methods that can be managed and configured from its dashboard. No additional implementation is needed in the project; you can use its APIs directly and focus on other project features.
Additionally, it has ready-made UI interfaces that can be placed directly into projects and supports many languages. Even native mobile applications (Kotlin and Swift) and Expo are supported. If you're truly developing products, Clerk would be an excellent choice.
Logto is similar to Clerk, but notably it can be self-hosted. If you don't want to use the cloud version, you can deploy it yourself, which makes integration with certain application projects easier to manage.
It also supports UI interfaces, but redirects to pages hosted by Logto itself.
In the past, I tried writing articles in MDX with next-mdx-remote. If articles are stored in the project folder, you can read them directly and pre-build these pages through Next.js.
```tsx
import { serialize } from "next-mdx-remote/serialize";
import { MDXRemote } from "next-mdx-remote";

const components = { Test };

export default function Page({ source }) {
  return (
    <div className="wrapper">
      <MDXRemote {...source} components={components} />
    </div>
  );
}

export async function getStaticProps() {
  // MDX text - can be from a local file, database, anywhere
  const source = "Some **mdx** text";
  const mdxSource = await serialize(source, {
    parseFrontmatter: false,
    mdxOptions: {
      remarkPlugins: [],
      rehypePlugins: [],
    },
  });
  return { props: { source: mdxSource } };
}
```

```tsx
import { MDXRemote } from "next-mdx-remote/rsc";

const components = {
  h1: (props) => (
    <h1 {...props} className="large-text">
      {props.children}
    </h1>
  ),
};

function MDXContent(props) {
  return (
    <MDXRemote
      {...props}
      components={{ ...components, ...(props.components || {}) }}
      options={{
        parseFrontmatter: false,
        mdxOptions: {
          remarkPlugins: [],
          rehypePlugins: [],
        },
      }}
    />
  );
}

const Page = async ({
  params,
}: {
  params: Promise<{
    slug: string;
  }>;
}) => {
  const { slug } = await params;
  const content = await getContent(slug);
  //    ^^^ any fs function
  return <MDXContent source={content} />;
};

export default Page;
```

next-mdx-remote simplified MDX compilation, but the remaining plugins and components still had to be implemented yourself. Whenever I updated packages, I needed to make sure the remark and rehype plugins still worked together, which took considerable maintenance time. So I later looked at Fumadocs, whose default plugins and components matched my needs.
Fumadocs is designed primarily for documentation websites, but because it provides multiple headless utilities and components, I was able to quickly build my current blog architecture. I write articles in the admin backend and compile them on the frontend, and it handles code-block and table syntax by default, greatly reducing my maintenance time.
```tsx
import { compileMDX } from "@fumadocs/mdx-remote";

import { FumadocsComponents, V1MDXComponents } from "./mdx-components";

const Page = async ({
  params,
}: {
  params: Promise<{
    slug: string;
  }>;
}) => {
  const { slug } = await params;
  const content = await getContent(slug);
  //    ^^^ get content from database
  const compiled = await compileMDX({
    source: content,
    components: {
      ...FumadocsComponents,
      ...V1MDXComponents,
    },
    mdxOptions: {
      remarkPlugins: [],
      rehypePlugins: [],
    },
  });
  return <compiled.body />;
};
```

Drizzle is my current ORM of choice. It's written directly in TypeScript and doesn't require any engine underneath. With a bundle size of only about 31 KB, it's also well suited to serverless environments, and it supports both relational and SQL-like query styles.
```ts
const result = await db.query.users.findFirst({
  where: (users, { eq }) => eq(users.id, dto.userId),
  with: {
    posts: true,
  },
});
```

```ts
const result = await db
  .select()
  .from(countries)
  .leftJoin(cities, eq(cities.countryId, countries.id))
  .where(eq(countries.id, 10));
```

Additionally, Drizzle supports various commands through drizzle-kit, such as migration and studio (a database management interface):
```shell
pnpm drizzle-kit generate
pnpm drizzle-kit migrate
pnpm drizzle-kit push
pnpm drizzle-kit pull
pnpm drizzle-kit check
pnpm drizzle-kit up
pnpm drizzle-kit studio
```

Drizzle's design emphasizes performance and type safety, and it takes a headless approach. Because its usage is more open-ended, I think Drizzle is better suited to people who already have SQL experience.
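For reference, the relational query shown earlier assumes tables and relations declared directly in TypeScript. A sketch with hypothetical table shapes (my actual schema differs):

```typescript
import { relations } from "drizzle-orm";
import { pgTable, text, timestamp } from "drizzle-orm/pg-core";

export const users = pgTable("users", {
  id: text("id").primaryKey(),
  name: text("name").notNull(),
});

export const posts = pgTable("posts", {
  id: text("id").primaryKey(),
  userId: text("user_id")
    .notNull()
    .references(() => users.id, { onDelete: "cascade" }),
  createdAt: timestamp("created_at").defaultNow(),
});

// relations() is what enables db.query.users.findFirst({ with: { posts: true } })
export const usersRelations = relations(users, ({ many }) => ({
  posts: many(posts),
}));

export const postsRelations = relations(posts, ({ one }) => ({
  user: one(users, { fields: [posts.userId], references: [users.id] }),
}));
```

The same file doubles as the source of truth for drizzle-kit's migration generation.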
Actually, the project initially used Prisma as its ORM. I think Prisma's user experience is very good; in particular, its schema definition is highly readable and centralized in a single schema.prisma file.
```prisma
generator client {
  provider        = "prisma-client-js"
  previewFeatures = ["jsonProtocol"]
}

datasource db {
  provider = "postgresql"
  url      = env("DATABASE_URL")
}

enum Role {
  ADMIN
  USER
}

model Post {
  id          String   @id @default(cuid())
  slug        String   @unique
  title       String
  excerpt     String
  tags        String[]
  headImg     String?
  createdAt   DateTime @default(now())
  updatedAt   DateTime @updatedAt
  readTime    Int?
  readingMins String?
  published   Boolean  @default(false)
  content     String
  user        User     @relation(fields: [userId], references: [id], onDelete: Cascade)
  userId      String
}
```

Additionally, its query syntax is easy to understand, and it likewise provides various commands for database management.
```ts
const result = await prisma.post.findMany({
  take: opts.input.take,
  skip: opts.input.skip,
  orderBy: { [opts.input.orderBy]: opts.input.sortOrder },
});
```

Version note: the Prisma version in use at the time was before 5.0.0, when Prisma's underlying architecture hadn't yet been rewritten and only a single schema file was supported.
The new architecture has been refactored with TypeScript and supports multiple schema syntax. If you're considering using Prisma now, these issues have been improved.
However, precisely because only one schema file could be written at the time, things became difficult when I had multiple databases or schemas to connect to, especially in a multi-tenant architecture. The bigger issue, though, was the underlying Rust engine: every run of prisma generate placed an engine binary in the project, which frequently caused deployment problems, such as serverless functions exceeding Vercel's size limit.
So I later rewrote my project with Drizzle. That said, if you're a backend beginner, Prisma will get you started faster; if you want finer control over your database architecture, try Drizzle.
| Feature | Drizzle | Prisma |
|---|---|---|
| Bundle Size | 31 KB | Larger (includes Rust engine) |
| Query Methods | SQL-like and Relational | Prisma Client API |
| Schema Definition | TypeScript | Prisma Schema Language |
| Multi-Schema Support | Native support | Prisma 5.0+ support |
| Type Safety | Fully type-safe | Fully type-safe |
| Learning Curve | Requires SQL basics | Easier to get started |
| Serverless Friendly | Excellent (no engine) | Good (but has engine) |
| Customization Level | High (close to native SQL) | Medium (through Prisma API) |
| Target Users | Developers with SQL experience | Backend beginners or those pursuing rapid development |
Finally, my UI choices. This project used Tailwind CSS from the start, but back then there weren't many UI kits built specifically for Tailwind, so many components had to be implemented by hand. Shadcn UI appeared first, combining Radix UI with Tailwind into open-source components: you can read the source directly, copy it into a project, and customize the styling. At the time it didn't yet provide a CLI for importing components, so I wrote my own UI kit following Shadcn UI's architecture, to cut down the time spent adjusting classNames, and consumed it as a package (ChiaStack, though I never found time to maintain and develop it further). Later, HeroUI came out and solved these problems for me even better.
HeroUI (formerly NextUI) is built on Tailwind under the hood (v1 originally used its own styling system), so we can customize components directly with Tailwind classNames. Its animations are written with Motion, and component effects can be further customized through motionProps.
```tsx
const Component = () => {
  // isPending / startTransition come from React's useTransition (elided here)
  return (
    <Button
      onPress={() =>
        startTransition(async () => {
          await authClient.signIn.social({
            provider: Provider.google,
            callbackURL: getCurrentDomain(),
          });
        })
      }
      isLoading={isPending}
      variant="flat"
      color="primary"
      isIconOnly
      className="mb-5 mt-auto h-12 w-1/2 p-2"
    >
      <Icon icon="flat-color-icons:google" width={28} />
    </Button>
  );
};
```

Its default styling is also an appearance I really like, so I adopted it decisively to speed up developing both the frontend and the admin interfaces.
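Wiring HeroUI into a project mainly means registering its Tailwind plugin. A sketch assuming HeroUI v2 with Tailwind CSS v3 (the content paths are illustrative):

```typescript
// tailwind.config.ts
import { heroui } from "@heroui/react";
import type { Config } from "tailwindcss";

export default {
  content: [
    "./src/**/*.{ts,tsx}",
    // HeroUI ships precompiled component styles that Tailwind must scan
    "./node_modules/@heroui/theme/dist/**/*.{js,ts,jsx,tsx}",
  ],
  darkMode: "class",
  plugins: [heroui()],
} satisfies Config;
```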
This year-long refactoring journey has given me a deeper understanding of full-stack development. From Turborepo's monorepo architecture, tRPC's type safety, Hono's lightweight efficiency, to Better Auth's security mechanisms, Fumadocs' document processing, Drizzle's SQL control, and HeroUI's rapid development—each tool choice is based on actual needs and developer experience considerations.
In this process, I learned several key principles:
In the future, I hope to continue expanding this project: