Like many smart home enthusiasts, I ended up with a fragmented setup: smart devices from six different brands, each with its own app. My phone’s smart home folder became a graveyard of icons just to control lights, fans, vacuums, air-conditioners, and doorbells. What I wanted was a single dashboard that could unify all of them. That’s where Home Assistant came in.

What is Home Assistant?
Home Assistant is an open-source home automation platform that can integrate thousands of devices, regardless of brand. Think of it as the universal translator for your smart home. Instead of jumping between multiple apps, you get one dashboard and one brain that can coordinate everything: lights, ACs, sensors, cameras, speakers, even your robot vacuum.
For me, the first integration I wanted was simple: when my bathroom humidity spikes after a shower, trigger my bedroom air-conditioner to switch to dry mode. This dries the ensuite bathroom quickly and reduces the chance of mold, without me having to buy a bulky dehumidifier. Unfortunately, LG’s ThinQ app doesn’t talk to Samsung SmartThings, so this automation was impossible... until Home Assistant.
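Jumping ahead for a moment, here is roughly what that automation looks like in Home Assistant's YAML. The entity IDs and humidity threshold below are placeholders for illustration, not my final config:

```yaml
automation:
  - alias: "Dry mode after shower"
    trigger:
      - platform: numeric_state
        entity_id: sensor.bathroom_humidity   # placeholder entity ID
        above: 70                             # humidity threshold, tune to taste
        for: "00:05:00"                       # must stay high for 5 minutes
    action:
      - service: climate.set_hvac_mode
        target:
          entity_id: climate.bedroom_ac       # placeholder entity ID
        data:
          hvac_mode: dry
```

Getting to the point where this is even possible is what the rest of this article is about.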
Why Docker on Synology
Home Assistant can run on a Raspberry Pi, virtual machine, or as a supervised OS install. I chose to run it in Docker on my Synology DS218+ NAS. The reason was simple: I didn’t want another piece of hardware humming away in my small apartment. My Synology NAS already runs 24/7, so why not host it there?
Spinning Up the Container
Installing Home Assistant via Synology’s Container Manager was straightforward.

I wanted everything codified, so I wrote a docker-compose.yml file:
```yaml
services:
  homeassistant:
    container_name: homeassistant
    image: "ghcr.io/home-assistant/home-assistant:stable"
    restart: unless-stopped
    dns:
      - 192.168.1.1 # Optional: if you want Home Assistant to resolve local hostnames, make it use your local DNS
    networks:
      - dockernet
    volumes:
      - /volume1/docker/homeassistant/config:/config
      - /etc/localtime:/etc/localtime:ro
      - /run/dbus:/run/dbus:ro
    privileged: true
    ports:
      - 8123:8123

networks:
  dockernet:
    name: dockernet
    driver: bridge
    ipam:
      config:
        - subnet: 172.10.0.0/16
```
With the container up and running, the next step was making it accessible from anywhere.
Making it Publicly Accessible
1. Custom Hostname via Terraform
I created `ha.example.com` in Route53 so I didn’t have to remember IP:port combinations.
```hcl
resource "aws_route53_record" "home" {
  zone_id = local.hosted_zone_id
  name    = "ha.example.com"
  type    = "A"
  ttl     = 86400
  records = [var.my_public_ip]
}
```
⚠️ Pitfall: Not everyone has a static IP, and if your ISP changes your public IP, this setup breaks.
- I built an open-source project auto-route53 that automatically updates your Route53 A record whenever your IP changes. It runs as an AWS Lambda function and hooks into Synology’s DDNS settings.
- If you don’t have your own domain, a dynamic DNS service (like DuckDNS) works fine too. Or you can pay for Home Assistant Cloud.
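At its core, any dynamic-DNS updater for Route53 (auto-route53 included) boils down to a single UPSERT call. A minimal boto3 sketch, where the zone ID, record name, and helper names are illustrative rather than taken from my project:

```python
def build_change_batch(name: str, ip: str, ttl: int = 300) -> dict:
    """Build the Route53 payload that points an A record at the current IP."""
    return {
        "Changes": [{
            "Action": "UPSERT",  # create the record, or overwrite it if it exists
            "ResourceRecordSet": {
                "Name": name,
                "Type": "A",
                "TTL": ttl,
                "ResourceRecords": [{"Value": ip}],
            },
        }]
    }

def update_record(zone_id: str, name: str, ip: str) -> None:
    import boto3  # AWS SDK for Python; credentials need route53:ChangeResourceRecordSets
    route53 = boto3.client("route53")
    route53.change_resource_record_sets(
        HostedZoneId=zone_id,
        ChangeBatch=build_change_batch(name, ip),
    )

# Example call (hypothetical zone ID):
# update_record("Z0123456789EXAMPLE", "ha.example.com", "203.0.113.7")
```

A short TTL matters here: with the 86400-second TTL from the Terraform record above, resolvers could serve a stale IP for up to a day after a change.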
2. Reverse Proxy in DSM
Synology’s built-in reverse proxy lets me map `https://ha.example.com` → `localhost:8123`.

3. Router Port Forwarding
I forwarded port 443 (HTTPS) on my router to the Synology NAS.

4. SSL Certificate
A Let’s Encrypt certificate secured the connection.

At this point, the flow looks like this: browser → `https://ha.example.com` (Route53) → home router (port 443) → Synology reverse proxy → Home Assistant container (port 8123).

Fixing Reverse Proxy Errors
The first time I tried accessing it, Home Assistant threw this error:
```
homeassistant | ERROR (MainThread) [homeassistant.components.http.forwarded]
A request from a reverse proxy was received from 172.10.0.1, but your HTTP integration is not set-up for reverse proxies
```
This happens because HA doesn’t automatically trust forwarded headers like `X-Forwarded-For`. The fix was simple: in `configuration.yaml`, add the subnet of the reverse proxy:
```yaml
http:
  use_x_forwarded_for: true
  trusted_proxies:
    - 172.10.0.0/16
```
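Why that particular subnet? The 172.10.0.1 address from the error log is the gateway of the dockernet bridge defined in the compose file, and it falls inside 172.10.0.0/16. A quick sanity check with Python's ipaddress module:

```python
import ipaddress

proxy_ip = ipaddress.ip_address("172.10.0.1")      # address HA saw in the error log
trusted = ipaddress.ip_network("172.10.0.0/16")    # the trusted_proxies entry

print(proxy_ip in trusted)  # → True, so HA will now honour X-Forwarded-For
```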
After restarting, that issue was gone.
Next came another roadblock. The UI displayed:
Something went wrong loading onboarding, try refreshing.
A quick peek at the developer console confirmed it was a WebSocket problem. Home Assistant’s WebSocket connections weren’t passing through the reverse proxy. The solution was to explicitly forward upgrade requests. In DSM’s reverse proxy custom headers:
| Header Name | Value |
| --- | --- |
| Upgrade | `$http_upgrade` |
| Connection | `$connection_upgrade` |

Once I added those, the WebSocket handshake succeeded and the UI loaded perfectly.
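For the curious: DSM's reverse proxy is nginx under the hood, and the hand-written equivalent of those two headers would look roughly like this sketch (server block details are assumptions, not DSM's actual generated config). The map block is what gives `$connection_upgrade` its value:

```nginx
# Set Connection to "upgrade" only when the client actually asks for a
# protocol upgrade; otherwise close it cleanly.
map $http_upgrade $connection_upgrade {
    default upgrade;
    ''      close;
}

server {
    listen 443 ssl;
    server_name ha.example.com;

    location / {
        proxy_pass http://localhost:8123;
        proxy_http_version 1.1;  # WebSockets require HTTP/1.1
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection $connection_upgrade;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    }
}
```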
Storage and Performance Considerations
One thing I hadn’t anticipated: Home Assistant stores detailed telemetry of every device. This meant my large 12TB NAS hard drive was constantly active and could no longer spin down. That’s bad news for both power usage and drive lifespan.
To fix this, I bought a Ugreen 2.5" SSD enclosure and salvaged an old 500GB Samsung EVO SSD from my desktop. I moved Docker and its volumes over to the SSD. This was a long and complicated process (worthy of its own article), but the result was worth it:
- The SSD now handles the constant reads/writes from Home Assistant and other containers.
- My NAS hard drive can go idle more often, preserving its lifespan.
Of course, SSDs wear out too, especially under 24/7 loads. But for my setup, it’s a worthwhile trade-off.
Backups
Backups are where “weekend tinkering” crosses into “production engineering.”
- Home Assistant Backups: I used the AWS S3 integration to back up configs to a dedicated S3 bucket. With Terraform, I created a service account restricted to just that bucket.
- Important: never use an admin account or a shared bucket. The principle of least privilege matters.
This is how I did it in Terraform:
```hcl
resource "aws_iam_user" "ha_backup_user" {
  provider = aws.iam_manager
  name     = "home-assistant-backup"
  path     = "/service-accounts/"

  tags = {
    Purpose = "Home Assistant S3 Backup"
  }
}

resource "aws_iam_access_key" "ha_backup_key" {
  provider = aws.iam_manager
  user     = aws_iam_user.ha_backup_user.name
}

resource "aws_iam_policy" "ha_backup_access" {
  provider    = aws.iam_manager
  name        = "HomeAssistantS3Backup"
  description = "Allow Home Assistant to use ha-backup-bucket"

  policy = jsonencode({
    Version = "2012-10-17",
    Statement = [
      {
        Sid    = "AllowS3BackupOperations",
        Effect = "Allow",
        Action = [
          "s3:ListBucket",
          "s3:GetObject",
          "s3:PutObject",
          "s3:DeleteObject",
          "s3:AbortMultipartUpload"
        ],
        Resource = [
          "arn:aws:s3:::${aws_s3_bucket.ha_backup.bucket}",
          "arn:aws:s3:::${aws_s3_bucket.ha_backup.bucket}/*"
        ]
      }
    ]
  })
}

resource "aws_iam_user_policy_attachment" "ha_backup_policy_attach" {
  provider   = aws.iam_manager
  user       = aws_iam_user.ha_backup_user.name
  policy_arn = aws_iam_policy.ha_backup_access.arn
}
```
- Container Volume Backups: My Synology runs multiple containers, so I wanted a unified backup strategy. Originally, I planned to use Hyper Backup to send Docker volumes directly to S3.

The problem: Hyper Backup doesn’t back up data stored on external USB drives. Since my volumes now live on the SSD, I had to improvise.
- I installed Synology’s USB Copy package.
- Scheduled it to copy volumes from the SSD to my secondary HDD.
- Then, from the HDD, Hyper Backup runs as usual to S3.
It’s not elegant, but it works, and redundancy matters more.


Security Notes
Exposing Home Assistant to the internet requires extra care:
- Always use HTTPS with a valid SSL certificate.
- Enable 2FA in Home Assistant for your account.
- Consider firewall rules, VPN access, or Cloudflare Tunnel if you want stricter access control.
If Home Assistant is compromised, attackers may gain control of cameras, microphones, and even door locks connected to it.
Since it sits inside your network, a vulnerability could also give them a foothold to move laterally and access other devices if your network isn’t properly segmented.
My Dashboard
After ironing out these setup issues, I finally built my Home Assistant dashboard.

It was incredibly satisfying to see all my devices on one dashboard instead of juggling six different apps.
Reflections and Next Steps
This was my first attempt at documenting one of my tinkering projects. Writing it helped me clarify my own process, and I hope it helps anyone else trying to wrangle a messy smart home into something coherent.
The setup itself was a reminder that technology always has quirks, and the fun lies in figuring them out. There were error messages, dead ends, and plenty of Googling. But the moment the dashboard came alive, it was worth every bit of troubleshooting.
Next, I’ll dive into automations, starting with the one that triggered this whole journey: using a humidity sensor in my ensuite bathroom to switch my bedroom AC into dry mode after a shower.