Technical Knowledge

Full-Stack Development, Cloud Architecture, and Modern Technologies

Bill's Technology Expertise Visualization - Animation End Frame

Introduction

Here is a selection of my skills.

Languages

TypeScript Logo JavaScript Logo C Sharp Logo Python Logo Rust Logo C Plus Plus Logo Java Logo Kotlin Logo Golang Logo

TypeScript & JavaScript

Five years of extensive experience in the TypeScript ecosystem, primarily with Node.js, React, and Next.js. Specialized in API development with Express.js and Zod, with comprehensive knowledge of the modern frontend toolchain. I've also been enjoying the Effect library lately.
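As a flavour of the kind of schema-validated API parsing Zod enables, here is a small hand-rolled sketch (the DTO and function names are illustrative, not from a real project):

```typescript
// Minimal sketch of schema-validated request parsing, in the spirit of Zod.
// In a real Express + Zod project this collapses to z.object({...}).safeParse(body).

type Result<T> = { ok: true; value: T } | { ok: false; error: string };

interface CreateUserDto {
  name: string;
  age: number;
}

// Validate an unknown request body into a typed DTO.
function parseCreateUser(body: unknown): Result<CreateUserDto> {
  if (typeof body !== "object" || body === null) {
    return { ok: false, error: "body must be an object" };
  }
  const b = body as Record<string, unknown>;
  if (typeof b.name !== "string" || b.name.length === 0) {
    return { ok: false, error: "name must be a non-empty string" };
  }
  if (typeof b.age !== "number" || !Number.isInteger(b.age) || b.age < 0) {
    return { ok: false, error: "age must be a non-negative integer" };
  }
  return { ok: true, value: { name: b.name, age: b.age } };
}
```

With Zod the same shape is a one-line schema, and the static TypeScript type is inferred from it, which is exactly why it pairs so well with Express handlers.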

C#

Five years developing desktop applications, web APIs, and IoT clients. Recent projects include building REST APIs using .NET Minimal APIs with FluentValidation.

Python

Utilized for metaprogramming, machine learning with PyTorch, and LLM applications. Recent project: developing a subtitle generator using Whisper for complex accent recognition.

Rust

Primary language for electronics projects, with experience building REST APIs using actix-web. Currently transitioning Rust expertise into commercial applications.

C/C++

Over a decade of professional experience, from Oracle OCI-powered billing engines to Kodi plugin optimization. Proficient with MSVC, GCC, and Clang, with recent work on embedded systems using Nordic's SDK and Zephyr.

Java & Kotlin

Ten years of Java experience spanning Swing frontends, Android development, and Spring Boot/JPA REST APIs. Recently focused on Kotlin for Android development with Jetpack Compose.

Go

Intermittent experience with Go, which I value for its clean, fast, and straightforward approach. Recent work with Gitea and the Gorilla web toolkit.

Additional Languages

Working knowledge of Scala, Ruby, Swift, Lisp, Haskell (including REST API development with Scotty), and Verilog for FPGA programming.

Frameworks

Angular Logo Vue Logo Nuxt Logo React Logo Next.js Logo Jest Logo Cypress Logo PHP Logo Express Logo Spring Logo

Angular

Built my own website using Angular, and later came back to it for a fun chessboard game experiment using SSR with incremental hydration. Angular may be verbose, but it's powerful when you need structure.

Vue.js & Nuxt

Created several personal projects using Vue's component-based architecture. Nuxt added a nice SSR layer when needed. I enjoy how Vue keeps things clean and makes progressive enhancement feel natural. Recent projects include building data visualization dashboards with Vue 3's Composition API and leveraging Nuxt 3's hybrid rendering for optimized performance.

React & Next.js

Five years of React experience, with a recent focus on Next.js for its flexible server/client component model. Usually pair it with Node.js APIs, but also comfortable using Next’s built-in API routes and Actions when it makes sense.

Jest and Cypress Testing

I’m a big believer in testing. Used Jest, Cypress, and React Testing Library to keep things solid, with coverage tracked in the CI/CD pipeline. Hitting good coverage (where it counts) is a nice signal things are on track. Having Node.js on both sides made it easy to keep tests flowing end-to-end.
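To show the shape of those tests, here is a self-contained sketch in the Jest style. The `expect` here is a toy stand-in so the snippet runs without Jest itself; `slugify` is a hypothetical unit under test:

```typescript
// Toy stand-in for Jest's expect(), just enough to show the test shape.
function expect<T>(actual: T) {
  return {
    toBe(expected: T): void {
      if (actual !== expected) {
        throw new Error(`expected ${String(expected)}, got ${String(actual)}`);
      }
    },
  };
}

// The unit under test: a typical small pure function.
function slugify(title: string): string {
  return title.trim().toLowerCase().replace(/\s+/g, "-");
}

// Jest-style assertions (in Jest these sit inside test("...", () => { ... })).
expect(slugify("  Hello World ")).toBe("hello-world");
expect(slugify("One two three")).toBe("one-two-three");
```

Keeping units this small and pure is what makes coverage numbers meaningful rather than cosmetic.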

PHP

Two years of PHP experience. The language once had a bad name, but I've seen real uptake again in recent years. I also recently wrote a React component for my wife's WordPress site.

Express

Used Express extensively as the backend layer for Node.js projects, especially when pairing with React or Next.js frontends. I appreciate how lightweight and unopinionated it is—it lets me shape the architecture without fighting defaults. Built several REST APIs with middleware chains that handle auth, logging, and error hygiene. Recently explored Express with server-side rendering and streaming responses for faster perceived load times.
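A middleware chain like that can be sketched as follows. The types and runner are minimal local stand-ins so the snippet runs without the `express` package; real code would use Express's own `Request`, `Response`, and `next`:

```typescript
// Sketch of an Express-style middleware chain: logging, auth, then the handler.

type Req = { headers: Record<string, string>; log: string[] };
type Res = { statusCode: number; body?: string };
type Next = (err?: Error) => void;
type Middleware = (req: Req, res: Res, next: Next) => void;

const logger: Middleware = (req, _res, next) => {
  req.log.push("request received");
  next();
};

const auth: Middleware = (req, res, next) => {
  if (req.headers["authorization"] === "Bearer valid-token") return next();
  res.statusCode = 401;
  res.body = "Unauthorized";
  next(new Error("unauthorized")); // hand the error down the chain
};

const handler: Middleware = (_req, res, next) => {
  res.statusCode = 200;
  res.body = "OK";
  next();
};

// Tiny dispatcher mimicking how Express walks the chain; an error
// short-circuits it, as Express's (err, req, res, next) handlers would.
function run(chain: Middleware[], req: Req, res: Res): void {
  let i = 0;
  const next: Next = (err?: Error) => {
    if (err) return; // stop; the response was already populated
    const mw = chain[i++];
    if (mw) mw(req, res, next);
  };
  next();
}
```

The appeal is exactly what the paragraph describes: each concern (auth, logging, error hygiene) is a small composable function, and the framework imposes nothing beyond the `(req, res, next)` contract.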

Spring Boot

Built production-grade REST APIs using Spring Boot, implementing OAuth2 for secure authentication and Redis for caching. I've also developed a gRPC-based service for my home project using Spring Boot Starter, integrating OpenTelemetry to stream telemetry data to Honeycomb. Spotless keeps code quality tight, and I appreciate how Spring Boot balances convention with flexibility. I always make lifecycle and dependency wiring explicit to ensure long-term maintainability.

Databases

PostgreSQL Logo Snowflake Logo MSSQL Logo Oracle Logo DB2 Logo MongoDB Logo

Extensive expertise in both traditional RDBMS and cloud-native database architectures. Proven success optimizing Oracle performance through memory-pinned tables and advanced tuning techniques. Deep understanding of partitioning strategies, indexing, cardinality estimation, and diagnostic tools such as EXPLAIN PLAN. Experienced in leveraging blob storage solutions for scalable image management.

Relational Databases

  • Azure SQL: Integration with Microsoft cloud services
  • PostgreSQL: Primary database for cloud applications on AWS RDS
  • Snowflake: Data warehousing solutions
  • MS SQL Server: On-premises and cloud-hosted applications
  • Oracle: Complex transactional systems
  • DB2: AS/400 and older systems

NoSQL Solutions

  • MongoDB: Document-based storage for flexible schema requirements

Cloud Infrastructure & DevOps

Azure Logo AWS Logo

Container Orchestration

Experienced in managing Kubernetes clusters both in AWS and locally using MicroK8s on Ubuntu.

Proficient in deploying and managing applications with Helm charts, configuring ingress with Nginx, and automating certificate management with Certbot.

AWS

Experience with:

  • EKS (Elastic Kubernetes Service)
  • EC2 (Elastic Compute Cloud)
  • Lambda serverless functions
  • API Gateway
  • RDS (Relational Database Service)
  • CloudWatch monitoring
  • Serverless Framework deployment
  • Advanced networking configuration
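Lambda behind API Gateway is the pattern I reach for most from that list. A minimal proxy-integration handler looks like this (event/result types are hand-written here to keep the sketch dependency-free; real projects would use `@types/aws-lambda`):

```typescript
// Sketch of an AWS Lambda handler for an API Gateway proxy integration.

interface ApiGatewayEvent {
  pathParameters?: Record<string, string>;
}

interface ApiGatewayResult {
  statusCode: number;
  headers: Record<string, string>;
  body: string;
}

// Read a path parameter and return a JSON response, as API Gateway expects.
export async function handler(event: ApiGatewayEvent): Promise<ApiGatewayResult> {
  const id = event.pathParameters?.id;
  if (!id) {
    return {
      statusCode: 400,
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ error: "missing id" }),
    };
  }
  return {
    statusCode: 200,
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ id, status: "found" }),
  };
}
```

The Serverless Framework then maps an HTTP route to this exported `handler`, and CloudWatch picks up anything it logs.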

Azure

Proficient with core Azure services including:

  • Service Bus for messaging architecture
  • Key Vault for secrets management
  • OAuth implementation
  • Azure SQL for database management
  • Blob Storage for object storage
  • Azure Pipelines for CI/CD workflows

Machine Learning

PyTorch Logo TensorFlow Logo

Machine Learning (practical) — PyTorch & TensorFlow notebooks, Docker/requirements for reproducible runs, and containerized inference examples.

  • Reproducible: Jupyter notebooks + Docker/requirements + seeded experiments.
  • Production-aware: experiment tracking (MLflow / W&B) and containerized serving (TorchServe / TF‑Serving / ONNX).
  • Results: trained CNNs & classifiers on Kaggle and local datasets — metrics and inference times available in linked notebooks.

Notebooks & runnable examples: TensorFlow notebooks · PyTorch notebooks. Demo/serving examples are included where available in the repositories.

Started getting to grips with TensorFlow and PyTorch — two frameworks with very different personalities but the same purpose. Working in Jupyter Notebooks inside VS Code made it easy to run code, record notes, and plot results.

Handled platform compatibility (CUDA/ROCm) to enable GPU training on my machine.

Build notes (TensorFlow & CUDA)

Getting TensorFlow running took a bit more effort: I had to build from source to support CUDA 13 and my RTX 5090 via WSL2 (since it needed sm_120), but the challenge was part of the fun. The build included compiling the runtime with the correct compute capability flags and validating GPU execution in the WSL2 environment.

For both frameworks I built and trained models for linear regression, classification, and CNNs, as well as experimenting with Kaggle datasets and pre-trained models to see how they performed.

PyTorch

I quickly produced examples for linear regression, classification, and simple neural networks. PyTorch felt natural — you can see and control what’s happening, and visualising data and results in Jupyter is effortless. Plotting loss and accuracy with matplotlib made experimentation quick and satisfying.

TensorFlow

TensorFlow came second, but I can see why it’s so popular. It’s straightforward to define and train models — sensible defaults and no explicit loops required. That said, I enjoyed writing those steps out manually in PyTorch; it helped me understand how the training process really works. I haven’t made TensorFlow device-agnostic yet, mainly because it just worked out of the box in my setup.

Reflections

Working with both frameworks side-by-side was ideal — same data, two different approaches. It’s incredibly easy in Python to visualise and compare outputs, and that makes learning rewarding. I’m not an expert yet, but the journey’s underway and I’ll keep investing time into it.

A big shout-out to the excellent CNN Explainer.

Personal Infrastructure

Maintain a sophisticated home lab environment running:

  • Postfix and Dovecot for email services
  • Grafana for monitoring and analytics
  • Caddy as a reverse proxy and SSL automation
  • MediaWiki for documentation
  • RoundCube webmail
  • Elasticsearch for search functionality
  • MicroK8s for container orchestration