6. Servers and Server Administration

Author

The Jornada Information Management Team

Published

November 2, 2025

Server administration

The Jornada IM system relies on several cloud servers (droplets) hosted at DigitalOcean (DO), which generally run Ubuntu Server 20.04. Below are some tools and practices for setting up servers, networking between them, managing user access, transferring and securing data, and other administrative tasks.

Cloud server setup tasks

Creating new services at DO is easy - it can be done from the dashboard or through the API. Documentation for all DO services is available on the DigitalOcean documentation site, and the recommended initial droplet setup docs are linked in the steps below.
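For example, a new droplet can be created from the command line with DO's doctl client; the droplet name, region, size, image slug, and SSH key fingerprint below are placeholders to be adjusted for the account:

  # Authenticate doctl with a DO API token (one-time setup)
  doctl auth init

  # List the SSH key fingerprints stored in the DO account
  doctl compute ssh-key list

  # Create an Ubuntu droplet (slugs and fingerprint are placeholders)
  doctl compute droplet create my-new-droplet \
    --region nyc3 \
    --size s-1vcpu-2gb \
    --image ubuntu-20-04-x64 \
    --ssh-keys <key-fingerprint>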

  1. Add a public key to the server - usually this can be done at creation or from an admin control panel. If it needs to be done after the fact, see the shell sketch following this list.

  2. Add a non-root user with sudo privileges and allow SSH access. More info here:

    • https://www.digitalocean.com/docs/droplets/tutorials/recommended-setup/
    • https://www.digitalocean.com/community/questions/how-to-enable-ssh-access-for-non-root-users
  3. Configure hostnames (if not already done at creation).

    • https://www.digitalocean.com/community/questions/how-do-i-change-hostname
  4. Install software, which varies depending on server purpose.

    • Metabase server: git, cron, PostgreSQL
    • Web servers: probably the LAMP stack and WordPress, maybe some JavaScript.
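The first three steps might look like the following on a fresh Ubuntu droplet; the username jrnadmin, the key text, and the hostname are placeholders:

  # 1. Add a public key after the fact. From the local machine:
  ssh-copy-id root@<droplet-ip>
  #    or, on the droplet, append the key to ~/.ssh/authorized_keys:
  mkdir -p ~/.ssh && chmod 700 ~/.ssh
  echo "ssh-ed25519 AAAA... user@laptop" >> ~/.ssh/authorized_keys
  chmod 600 ~/.ssh/authorized_keys

  # 2. Add a non-root user and grant sudo privileges
  adduser jrnadmin
  usermod -aG sudo jrnadmin

  # Copy root's authorized keys to the new user so SSH login works
  rsync --archive --chown=jrnadmin:jrnadmin ~/.ssh /home/jrnadmin

  # 3. Set the hostname if it was not set at creation
  hostnamectl set-hostname <new-hostname>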

Metabase server config

  1. Clone the LTER-core-metabase and jrn-db-utils (private) repositories to a directory in home.
  2. Create the backup directory with mkdir -p /home/backups/postgresql and make sure it is owned by whoever operates the PostgreSQL backups.
  3. Add the jrn-db-utils/sh/pb_backup_rotated.sh script to crontab so it runs nightly (see the sketch following this list).
  4. Make sure incoming connections for SSH (TCP port 22) and PostgreSQL (TCP port 5432) are allowed in the firewall (currently using the DO firewall).
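A minimal sketch of steps 1-3, assuming the repositories go under the admin user's home directory; the clone URLs, backup owner, and run time are assumptions:

  # 1. Clone the repositories into the home directory (URLs are
  #    placeholders; jrn-db-utils is private and requires authentication)
  cd ~
  git clone <LTER-core-metabase-url>
  git clone <jrn-db-utils-url>

  # 2. Create the backup directory and hand it to the backup operator
  #    (the postgres owner shown here is an assumption)
  sudo mkdir -p /home/backups/postgresql
  sudo chown -R postgres:postgres /home/backups/postgresql

  # 3. In the backup operator's crontab (crontab -e), schedule the
  #    backup script nightly, e.g. at 02:00 (path is illustrative)
  0 2 * * * /home/jrnadmin/jrn-db-utils/sh/pb_backup_rotated.sh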

Securing cloud servers

Droplets are behind a DO cloud firewall, but if needed, a host-level firewall can also be set up on individual droplets with UFW. This and other hardening tasks are described in the initial server setup docs linked above.
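If a per-droplet firewall is needed, a minimal UFW setup looks something like this (the PostgreSQL rule only applies to the metabase server; allowed ports should match the droplet's purpose):

  # Allow SSH first so enabling the firewall does not cut off the session
  sudo ufw allow OpenSSH

  # Allow PostgreSQL connections (metabase server only)
  sudo ufw allow 5432/tcp

  # Enable the firewall and confirm the rules
  sudo ufw enable
  sudo ufw status verbose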

Unattended updates are also a good idea. On Ubuntu, install the unattended-upgrades package and follow its setup instructions.
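A quick sketch of enabling unattended upgrades on Ubuntu:

  # Install the package
  sudo apt update && sudo apt install unattended-upgrades

  # Enable automatic updates (writes /etc/apt/apt.conf.d/20auto-upgrades)
  sudo dpkg-reconfigure --priority=low unattended-upgrades

  # Check that the apt timers are active
  systemctl status apt-daily.timer apt-daily-upgrade.timer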

Scheduled tasks

Some server tasks, like database or website backups, should be scheduled with cron.
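Cron jobs are added by editing the relevant user's crontab; each entry gives minute, hour, day of month, month, and day of week, followed by the command. The script path below is just an illustration:

  # Open the current user's crontab for editing
  crontab -e

  # min hour day-of-month month day-of-week  command
  # e.g., run a website backup every Sunday at 03:30
  30 3 * * 0 /home/jrnadmin/bin/backup_website.sh

  # List currently scheduled jobs
  crontab -l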

NFS and CIFS mounts to remote directories

The JORNADA-NETB1 storage block (sometimes called the R drive) accepts CIFS (SMB) connections.
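Mounting the share from a droplet might look like the following; the server address, share name, mount point, and credentials file are all placeholders, since the real values are not recorded here:

  # Install CIFS utilities
  sudo apt install cifs-utils

  # Create a mount point and mount the share
  sudo mkdir -p /mnt/jornada-netb1
  sudo mount -t cifs //<server-address>/<share-name> /mnt/jornada-netb1 \
    -o credentials=/root/.smbcredentials,uid=1000,gid=1000

  # For a persistent mount, add a matching line to /etc/fstab:
  # //<server-address>/<share-name> /mnt/jornada-netb1 cifs credentials=/root/.smbcredentials,uid=1000,gid=1000 0 0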


Current backup plan

Termite

  • PostgreSQL databases (jgeo) are dumped to /home/backups/postgresql by a nightly cron job, and daily, weekly, and monthly copies are retained.
  • No other backups, and no remote/off-premises backups (TODO).

Oryx

  • PostgreSQL databases are dumped to /home/backups/postgresql by a nightly cron job, and a few daily, weekly, and monthly copies are retained.
  • /home/gmaurer/ and /home/backups have nightly incremental backups to Packrat using Active Backup for Business (ActiveBackupForBusiness share).

JRN Metnet (Loggernet computer)

  • This PC runs the Synology Active Backup for Business client, and the data drive (not sure of letter) is backed up nightly to Packrat (ActiveBackupForBusiness share). Daily, weekly, and monthly copies of this backup are retained.
  • Conrad and John do some regular backups to Dropbox and external drives, but I’m not sure what those entail.

Sala lab Dropbox

  • This is synced to the Researcher_backups share on Packrat using the Synology Cloud Sync app.

Packrat

There are two backup tasks that back up data from Packrat to a Backblaze S3 bucket:

  • A nightly backup of all data in home, jrn_fieldobs, jrn_geospatial, jrn_imagery, and jrn_sensors
  • A weekly backup of everything in the ActiveBackupForBusiness and Researcher_backups shares. This includes data from several of the versioned or incremental backups above (JRN Metnet, Oryx, and Sala lab data).

Laptops

  • Greg’s laptop is backed up to his home Synology device using Time Machine
  • Geovany has backed up his laptop to Packrat.
  • Others?

Cloud Service Provider data stores

For shared files on Google Drive and NMSU OneDrive/SharePoint, we just rely on the CSP system, for better or worse…