Setting Up Home Assistant on My Synology DS218+

One Dashboard to Rule Them All


Like many smart home enthusiasts, I ended up with a fragmented setup: smart devices from six different brands, each with its own app. My phone’s smart home folder became a graveyard of icons just to control lights, fans, vacuums, air-conditioners, and doorbells. What I wanted was a single dashboard that could unify all of them. That’s where Home Assistant came in.

Screenshot of a smartphone folder showing icons for SmartThings, Ring, Tapo, Home, LG ThinQ, BroadLink, and Home Assistant.
My smartphone home folder before I set up Home Assistant.

What is Home Assistant?

Home Assistant is an open-source home automation platform that can integrate thousands of devices, regardless of brand. Think of it as the universal translator for your smart home. Instead of jumping between multiple apps, you get one dashboard and one brain that can coordinate everything: lights, ACs, sensors, cameras, speakers, even your robot vacuum.

For me, the first integration I wanted was simple: when my bathroom humidity spikes after a shower, trigger my bedroom air-conditioner to switch to dry mode. This dries the ensuite bathroom quickly and reduces the chance of mold, without me having to buy a bulky dehumidifier. Unfortunately, LG’s ThinQ app doesn’t talk to Samsung SmartThings, so this automation was impossible... until Home Assistant.
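To make that concrete, here is a minimal sketch of what such an automation looks like in Home Assistant's YAML format. The entity IDs (sensor.ensuite_humidity, climate.bedroom_ac) and the 70% threshold are placeholders for illustration, not my actual configuration:

```yaml
# Sketch only -- entity IDs and the threshold are illustrative placeholders.
automation:
  - alias: "Shower humidity -> AC dry mode"
    trigger:
      - platform: numeric_state
        entity_id: sensor.ensuite_humidity
        above: 70
    action:
      - service: climate.set_hvac_mode
        target:
          entity_id: climate.bedroom_ac
        data:
          hvac_mode: dry
```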

Why Docker on Synology

Home Assistant can run on a Raspberry Pi, virtual machine, or as a supervised OS install. I chose to run it in Docker on my Synology DS218+ NAS. The reason was simple: I didn’t want another piece of hardware humming away in my small apartment. My Synology NAS already runs 24/7, so why not host it there?

Spinning Up the Container

Installing Home Assistant via Synology’s Container Manager was straightforward.

Screenshot of the Home Assistant container in Synology Container Manager.
Home Assistant container running in Synology's Container Manager.

I wanted everything codified, so I wrote a docker-compose.yml file:

```yaml
services:
  homeassistant:
    container_name: homeassistant
    image: "ghcr.io/home-assistant/home-assistant:stable"
    restart: unless-stopped
    dns:
      - 192.168.1.1 # Optional: point Home Assistant at your local DNS so it can resolve local hostnames
    networks:
      - dockernet
    volumes:
      - /volume1/docker/homeassistant/config:/config
      - /etc/localtime:/etc/localtime:ro
      - /run/dbus:/run/dbus:ro
    privileged: true
    ports:
      - 8123:8123

networks:
  dockernet:
    name: dockernet
    driver: bridge
    ipam:
      config:
        - subnet: 172.10.0.0/16
```
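One caveat worth flagging: 172.10.0.0/16 is not RFC 1918 private address space (the private 172.x block is 172.16.0.0/12), so this subnet overlaps publicly routable addresses. Docker accepts it without complaint, but a range like 172.16.0.0/16 would be safer. Python's standard library confirms the distinction:

```python
import ipaddress

# The subnet used in the compose file above is NOT private address space:
print(ipaddress.ip_network("172.10.0.0/16").is_private)  # False

# The RFC 1918 block people usually mean to use:
print(ipaddress.ip_network("172.16.0.0/12").is_private)  # True
```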

With the container up and running, the next step was making it accessible from anywhere.

Making it Publicly Accessible

1. Custom Hostname via Terraform

I created ha.example.com in Route53 so I didn’t have to remember IP:port combinations.

```terraform
resource "aws_route53_record" "home" {
  zone_id = local.hosted_zone_id
  name    = "ha.example.com"
  type    = "A"
  ttl     = 86400
  records = [var.my_public_ip]
}
```

⚠️ Pitfall: this only works with a static public IP. If your ISP rotates your address, the record goes stale and the setup breaks; a dynamic DNS updater (or at least a lower TTL while you test) mitigates this.

2. Reverse Proxy in DSM

Synology’s built-in reverse proxy lets me map https://ha.example.com → localhost:8123.

Screenshot of Synology Reverse Proxy Setting

3. Router Port Forwarding

I forwarded port 443 (HTTPS) on my router to the Synology NAS.

Screenshot of router port forwarding settings

4. SSL Certificate

A Let’s Encrypt certificate secured the connection.

Screenshot of Let's Encrypt certificate

At this point, the flow looks like this:

Network flow diagram

Fixing Reverse Proxy Errors

The first time I tried accessing it, Home Assistant threw this error:

```
homeassistant | ERROR (MainThread) [homeassistant.components.http.forwarded]
A request from a reverse proxy was received from 172.10.0.1, but your HTTP integration is not set-up for reverse proxies
```

This happens because HA doesn’t automatically trust forwarded headers like X-Forwarded-For. The fix was simple: in configuration.yaml, add the subnet of the reverse proxy:

```yaml
http:
  use_x_forwarded_for: true
  trusted_proxies:
    - 172.10.0.0/16
```

After restarting, that issue was gone.
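Why this fix works: Home Assistant compares the proxy's source address (172.10.0.1, the Docker bridge gateway in my case) against the trusted_proxies CIDRs and only honors forwarded headers on a match. You can sanity-check the containment with Python's standard ipaddress module:

```python
import ipaddress

proxy_ip = ipaddress.ip_address("172.10.0.1")     # address HA sees requests coming from
trusted = ipaddress.ip_network("172.10.0.0/16")   # the trusted_proxies entry

# True -> HA will accept the X-Forwarded-For header from this proxy
print(proxy_ip in trusted)  # True
```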

Next came another roadblock. The UI displayed: “Something went wrong loading onboarding, try refreshing.”

A quick peek at the developer console confirmed it was a WebSocket problem. Home Assistant’s WebSocket connections weren’t passing through the reverse proxy. The solution was to explicitly forward upgrade requests. In DSM’s reverse proxy custom headers:

| Header Name | Value               |
| ----------- | ------------------- |
| Upgrade     | $http_upgrade       |
| Connection  | $connection_upgrade |
Screenshot of DSM reverse proxy header settings

Once I added those, the WebSocket handshake succeeded and the UI loaded perfectly.
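For the curious, those two headers are the standard nginx WebSocket-upgrade incantation. This is a rough sketch of what DSM configures behind the scenes, not its actual generated file (DSM derives Connection from a $connection_upgrade map variable; a literal "upgrade" is shown here for simplicity):

```nginx
location / {
    proxy_pass http://localhost:8123;
    proxy_http_version 1.1;

    # Forward the WebSocket handshake so HA's live UI can connect
    proxy_set_header Upgrade $http_upgrade;
    proxy_set_header Connection "upgrade";

    # Preserve the client's real address for HA's trusted_proxies check
    proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    proxy_set_header Host $host;
}
```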

Storage and Performance Considerations

One thing I hadn’t anticipated: Home Assistant stores detailed telemetry of every device. This meant my large 12TB NAS hard drive was constantly active and could no longer spin down. That’s bad news for both power usage and drive lifespan.

To fix this, I bought a Ugreen 2.5" SSD enclosure and salvaged an old 500GB Samsung EVO SSD from my desktop. I moved Docker and its volumes over to the SSD. This was a long and complicated process (worthy of its own article), but the result was worth it: the 12TB drive can hibernate again, while the SSD absorbs Home Assistant's constant writes.

Of course, SSDs wear out too, especially under 24/7 loads. But for my setup, it’s a worthwhile trade-off.

Backups

Backups are where “weekend tinkering” crosses into “production engineering.”

This is how I did it in Terraform: a dedicated IAM service account whose only permission is read/write access to the backup bucket.

```terraform
resource "aws_iam_user" "ha_backup_user" {
  provider = aws.iam_manager
  name     = "home-assistant-backup"
  path     = "/service-accounts/"
  tags = {
    Purpose = "Home Assistant S3 Backup"
  }
}

resource "aws_iam_access_key" "ha_backup_key" {
  provider = aws.iam_manager
  user     = aws_iam_user.ha_backup_user.name
}

resource "aws_iam_policy" "ha_backup_access" {
  provider    = aws.iam_manager
  name        = "HomeAssistantS3Backup"
  description = "Allow Home Assistant to use ha-backup-bucket"
  policy = jsonencode({
    Version = "2012-10-17",
    Statement = [
      {
        Sid    = "AllowS3BackupOperations",
        Effect = "Allow",
        Action = [
          "s3:ListBucket",
          "s3:GetObject",
          "s3:PutObject",
          "s3:DeleteObject",
          "s3:AbortMultipartUpload"
        ],
        Resource = [
          "arn:aws:s3:::${aws_s3_bucket.ha_backup.bucket}",
          "arn:aws:s3:::${aws_s3_bucket.ha_backup.bucket}/*"
        ]
      }
    ]
  })
}

resource "aws_iam_user_policy_attachment" "ha_backup_policy_attach" {
  provider   = aws.iam_manager
  user       = aws_iam_user.ha_backup_user.name
  policy_arn = aws_iam_policy.ha_backup_access.arn
}
```
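The policy above interpolates aws_s3_bucket.ha_backup, which isn't shown. A minimal sketch of that resource, using the bucket name from the policy description:

```terraform
resource "aws_s3_bucket" "ha_backup" {
  bucket = "ha-backup-bucket"
}
```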

Container Volume Backups: My Synology runs multiple containers, so I wanted a unified backup strategy. Originally, I planned to use Hyper Backup to send Docker volumes directly to S3.

The problem: Hyper Backup doesn’t back up data stored on external USB drives. Since my volumes now live on the SSD, I had to improvise: a scheduled USB Copy task first mirrors the SSD’s Docker volumes back onto the internal drive, then a Hyper Backup task ships that copy off to S3.

It’s not elegant, but it works, and redundancy matters more than elegance.

Screenshot of USB Copy schedule
USB Copy should run first
Screenshot of Hyper Backup task
Hyper Backup should run second

Security Notes

Exposing Home Assistant to the internet requires extra care:

If Home Assistant is compromised, attackers may gain control of cameras, microphones, and even door locks connected to it.

Since it sits inside your network, a vulnerability could also give them a foothold to move laterally and access other devices if your network isn’t properly segmented.

My Dashboard

After ironing out these setup issues, I finally built my Home Assistant dashboard.

Screenshot of my Home Assistant Dashboard

It was incredibly satisfying to see all my devices on one dashboard instead of juggling six different apps.

Reflections and Next Steps

This was my first attempt at documenting one of my tinkering projects. Writing it helped me clarify my own process, and I hope it helps anyone else trying to wrangle a messy smart home into something coherent.

The setup itself was a reminder that technology always has quirks, and the fun lies in figuring them out. There were error messages, dead ends, and plenty of Googling. But the moment the dashboard came alive, it was worth every bit of troubleshooting.

Next, I’ll dive into automations, starting with the one that triggered this whole journey: using a humidity sensor in my ensuite bathroom to switch my bedroom AC into dry mode after a shower.