Storing Laravel database backups on Amazon S3
Backups are important. Keeping regular backups of your (client's) website databases can be a lifesaver in case of unexpected events. Hopefully you'll never have to use them, but having them around, knowing you can roll back to yesterday's state in minutes, does give you some peace of mind.
Create an Amazon S3 Bucket
S3 buckets are a robust storage medium widely used in the Laravel community. They are easy to use and relatively cheap. Creating one does require an AWS account.
- In the AWS console, navigate to S3 > Create bucket
- Choose a bucket name: s3-your-bucket-name
- Choose a bucket region: (eu-central-1)
- Block all public access
- Bucket versioning: disable
- Add tags (optional)
  - key: client
  - value: client-name
- Default encryption: disable
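If you prefer the CLI over the console, the same bucket can be provisioned with the AWS CLI. This is a sketch using the placeholder bucket name and region from above; it assumes the AWS CLI is installed and configured with credentials that may create buckets.

```shell
# Create the bucket in eu-central-1 (outside us-east-1 a LocationConstraint is required).
aws s3api create-bucket \
  --bucket s3-your-bucket-name \
  --region eu-central-1 \
  --create-bucket-configuration LocationConstraint=eu-central-1

# Block all public access, as in the console checklist above.
aws s3api put-public-access-block \
  --bucket s3-your-bucket-name \
  --public-access-block-configuration \
  BlockPublicAcls=true,IgnorePublicAcls=true,BlockPublicPolicy=true,RestrictPublicBuckets=true
```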
Create an IAM Policy
An IAM policy lets you grant a specific set of permissions to specific users.
- In the AWS console, navigate to IAM > Policies > Create Policy
- Add the JSON policy:
```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:ListAllMyBuckets", "s3:GetBucketLocation"],
      "Resource": "*"
    },
    {
      "Effect": "Allow",
      "Action": "s3:ListBucket",
      "Resource": "arn:aws:s3:::s3-your-bucket-name",
      "Condition": {
        "StringLike": { "s3:prefix": ["", "${aws:username}/*"] }
      }
    },
    {
      "Effect": "Allow",
      "Action": "s3:*",
      "Resource": [
        "arn:aws:s3:::s3-your-bucket-name/${aws:username}",
        "arn:aws:s3:::s3-your-bucket-name/${aws:username}/*"
      ]
    }
  ]
}
```
- Click Next and fill in:
  - Name: S3AllowAllActionsInUserFolder-[s3-your-bucket-name]
  - Description: Full read and write access in the bucket, starting from the folder named after the username.
- Click 'Create policy'
Create a User Group
An IAM User Group is an effective way to assign one or more policies to your users.
- In the AWS console, navigate to IAM > Groups > Create group
- Name: S3BucketUser-[s3-your-bucket-name]
- Policies: S3AllowAllActionsInUserFolder-[s3-your-bucket-name]
Create an IAM user
Now we will create the user that we will actually use to manage our database backups.
Note: AWS has recently updated their IAM user creation flow; the instructions below reflect the new flow.
- In AWS console navigate to IAM > Users > Add User
- Define a username: your-user
- Do NOT select "Enable console access"!
- Click Next
- Add the user to the group you just created: S3BucketUser-[s3-your-bucket-name]
- Click Next, Next, Create user
- The user has been created; now we need to generate an access key. Select the user and open the Security credentials tab.
- Scroll down to Access keys and click Create access key
- Select the radio button next to "Application running outside AWS" and click Next
- Add a description (optional)
- Copy the access key & secret and store them both securely!
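If you have the AWS CLI installed, you can sanity-check the new key pair before touching Laravel. A sketch; the profile name is an arbitrary choice:

```shell
# Store the new key pair in a named profile (prompts for the Access key ID and Secret).
aws configure --profile backup-user

# Confirm the credentials authenticate; this should report the account ID
# and the ARN of the IAM user created above.
aws sts get-caller-identity --profile backup-user
```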
Configure S3 bucket in Laravel
To allow Laravel to access our S3 bucket we need to install a utility package. This is also explained in the official Laravel documentation.
- Install the S3 Flysystem driver. For Laravel >= 9:

```shell
composer require league/flysystem-aws-s3-v3:"^3.0"
```

Note: older Laravel versions do not support Flysystem AWS S3 v3. See: https://stackoverflow.com/questions/65002425/league-flysystem-aws-s3-v3-on-laravel-8-other-packages-require-lower-version Use v1 instead:

```shell
composer require league/flysystem-aws-s3-v3:"^1.0"
```
- Edit .env:

```
AWS_USERNAME=
AWS_ACCESS_KEY_ID=
AWS_SECRET_ACCESS_KEY=
AWS_DEFAULT_REGION=
AWS_BUCKET=
```
- Edit config/filesystems.php and set the root folder to the IAM username:

```php
's3' => [
    'driver' => 's3',
    'key' => env('AWS_ACCESS_KEY_ID'),
    'secret' => env('AWS_SECRET_ACCESS_KEY'),
    'region' => env('AWS_DEFAULT_REGION'),
    'bucket' => env('AWS_BUCKET'),
    'url' => env('AWS_URL'),
    'endpoint' => env('AWS_ENDPOINT'),
    'root' => env('AWS_USERNAME'), // <-
],
```
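Before moving on, it can be worth a quick smoke test to confirm Laravel can actually reach the bucket. A sketch; the file name is arbitrary, and the `--execute` flag assumes laravel/tinker v2:

```shell
# Writes and reads back a throwaway file on the s3 disk.
# If credentials, bucket, or region are wrong, this throws instead.
php artisan tinker --execute="
    var_dump(Storage::disk('s3')->put('connection-test.txt', 'hello'));
    echo Storage::disk('s3')->get('connection-test.txt');
"
```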
Setting up Spatie laravel-backup
This package developed by Spatie has all the tools to create and organise our backups. The package is free to use; one day you'll thank them for it!
- Install the laravel-backup package via composer:

```shell
composer require spatie/laravel-backup
```

- Publish the backup.php config:

```shell
php artisan vendor:publish --provider="Spatie\Backup\BackupServiceProvider" --tag=backup-config
```
Edit /config/backup.php
- Define your app name in your .env or set it manually:

```php
'name' => 'mysql-backup',
```
- Include files? Uncomment base_path() if you would like to back up your entire filesystem:

```php
/*
 * The list of directories and files that will be included in the backup.
 */
'include' => [
    // base_path(), // don't back up any files (they are in git, storage is on S3)
],
```
- Compression:

```php
'database_dump_compressor' => Spatie\DbDumper\Compressors\GzipCompressor::class,
```
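The GzipCompressor pipes the dump through gzip before it is added to the backup archive. The effect on size is easy to see locally; a sketch with a fake dump file, no MySQL needed:

```shell
# Generate a plain-text stand-in for a SQL dump.
for i in $(seq 1 1000); do echo "INSERT INTO users VALUES ($i);"; done > dump.sql

# Compress it the way the compressor would; SQL text compresses very well.
gzip -c dump.sql > dump.sql.gz

# Compare sizes: the .gz file is a fraction of the original.
ls -l dump.sql dump.sql.gz
```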
- Destination:

```php
'destination' => [
    /*
     * The filename prefix used for the backup zip file.
     */
    'filename_prefix' => '',

    /*
     * The disk names on which the backups will be stored.
     */
    'disks' => [
        // 'local',
        's3', // <-
    ],
],
```
- Disable notifications (optional):

```php
'notifications' => [
    \Spatie\Backup\Notifications\Notifications\BackupHasFailedNotification::class => [],
    \Spatie\Backup\Notifications\Notifications\UnhealthyBackupWasFoundNotification::class => [],
    \Spatie\Backup\Notifications\Notifications\CleanupHasFailedNotification::class => [],
    \Spatie\Backup\Notifications\Notifications\BackupWasSuccessfulNotification::class => [],
    \Spatie\Backup\Notifications\Notifications\HealthyBackupWasFoundNotification::class => [],
    \Spatie\Backup\Notifications\Notifications\CleanupWasSuccessfulNotification::class => [],
],
```
- Monitor backups:

```php
'monitor_backups' => [
    [
        'name' => 'mysql-backup',
        'disks' => ['s3'], // <=
        'health_checks' => [
            \Spatie\Backup\Tasks\Monitor\HealthChecks\MaximumAgeInDays::class => 1,
            \Spatie\Backup\Tasks\Monitor\HealthChecks\MaximumStorageInMegabytes::class => 5000,
        ],
    ],
],
```
Configuring the mysqldump binary path
- Localhost (testing), in .env:

```
# PATH TO mysqldump BINARY (MAMP)
DB_DUMP_BINARY_PATH="/Applications/MAMP/Library/bin"
```
- /config/database.php:

```php
'mysql' => [
    'driver' => 'mysql',
    'url' => env('DATABASE_URL'),
    'host' => env('DB_HOST', '127.0.0.1'),
    'port' => env('DB_PORT', '3306'),
    'database' => env('DB_DATABASE', 'forge'),
    'username' => env('DB_USERNAME', 'forge'),
    'password' => env('DB_PASSWORD', ''),
    'unix_socket' => env('DB_SOCKET', ''),
    'charset' => 'utf8mb4',
    'collation' => 'utf8mb4_unicode_ci',
    'prefix' => '',
    'prefix_indexes' => true,
    'strict' => true,
    'engine' => null,
    'options' => extension_loaded('pdo_mysql') ? array_filter([
        PDO::MYSQL_ATTR_SSL_CA => env('MYSQL_ATTR_SSL_CA'),
    ]) : [],
    'dump' => [
        'dump_binary_path' => env('DB_DUMP_BINARY_PATH', '/usr/bin'),
    ],
],
// ...
```
Note: the mysqldump binary path on your production server may be different. To verify, log in to your production server via SSH and run:

```shell
whereis mysqldump
```
Testing on localhost (optional)
- Run in a terminal:

```shell
php artisan backup:run
```

Output:

```
Starting backup...
Dumping database moreweb-be...
Determining files to backup...
Zipping 1 files and directories...
Created zip containing 1 files and directories. Size is 3.11 KB
Copying zip to disk named s3...
Successfully copied zip to disk named s3.
Backup completed!
```
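To double-check that the zip actually landed on S3, spatie/laravel-backup also ships overview and health-check commands (run from the project root, against the config above):

```shell
# Overview of existing backups on the monitored disks.
php artisan backup:list

# Runs the health checks defined under monitor_backups in config/backup.php.
php artisan backup:monitor
```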
Configure a daily backup job in Forge
I'm using Laravel Forge in this case, but any scheduler will do.
- Configure the schedule in /app/Console/Kernel.php:

```php
protected function schedule(Schedule $schedule)
{
    if (app()->environment('production')) {
        $schedule->command('backup:clean')->dailyAt('03:00');
        $schedule->command('backup:run')->dailyAt('03:05');
    }
}
```
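Forge's scheduler effectively manages a cron entry for you. On a plain server without Forge, the equivalent crontab line would look roughly like this; the PHP binary name and artisan path are assumptions, adjust both to your server:

```shell
# crontab -e for the deploy user:
# cron fires every minute; Laravel's scheduler then decides which commands are due.
* * * * * php8.1 /home/forge/your-site/artisan schedule:run >> /dev/null 2>&1
```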
- Log in to your Laravel Forge account
- Navigate to your project / site
- Navigate to your server (forge) > Scheduler
  - Command: php8.1 /home/path/to/artisan schedule:run
  - User: your-user
  - Frequency: Every minute
- Edit Environment, add:

```
AWS_USERNAME=
AWS_ACCESS_KEY_ID=
AWS_SECRET_ACCESS_KEY=
AWS_DEFAULT_REGION=
AWS_BUCKET=
```
- Deploy the app
Happy backing up!