Amazon Web Services - Cloud provider
2018 CloudWatch syslog
Install agent
$ sudo dpkg -iE amazon-cloudwatch-agent.deb
$ sudo systemctl enable amazon-cloudwatch-agent.service
$ sudo systemctl start amazon-cloudwatch-agent.service
$ sudo systemctl status amazon-cloudwatch-agent.service
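The agent also needs to be told which files to ship. A minimal sketch of the agent config (commonly at /opt/aws/amazon-cloudwatch-agent/etc/amazon-cloudwatch-agent.json) for collecting syslog; the log group name "syslog" is an assumed choice, pick your own:

```json
{
  "logs": {
    "logs_collected": {
      "files": {
        "collect_list": [
          {
            "file_path": "/var/log/syslog",
            "log_group_name": "syslog",
            "log_stream_name": "{hostname}"
          }
        ]
      }
    }
  }
}
```

{hostname} is a placeholder the agent expands per machine, so several hosts can share one log group with one stream each.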
syslog (older awslogs agent; marked as removed on the wiki, kept here for reference)
$ curl https://s3.amazonaws.com/aws-cloudwatch/downloads/latest/awslogs-agent-setup.py -O
$ sudo python3 awslogs-agent-setup.py
aws cli cloudwatch log groups
$ aws logs describe-log-groups --profile nonprod
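The command returns JSON. A sketch of pulling just the group names out of that response with python3, run here against a canned sample of the response shape (the real call needs credentials; the aws cli's own `--query 'logGroups[].logGroupName'` does the same job):

```shell
# Abridged sample of the JSON shape `aws logs describe-log-groups` returns.
cat > /tmp/log-groups.json <<'EOF'
{"logGroups": [{"logGroupName": "syslog", "storedBytes": 1024},
               {"logGroupName": "auth", "storedBytes": 2048}]}
EOF
# Extract just the log group names from the response.
python3 -c 'import json
for g in json.load(open("/tmp/log-groups.json"))["logGroups"]:
    print(g["logGroupName"])'
```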
2018 aws tool, setup for s3 upload on Raspberry Pi 3
- $ sudo pip3 install -U aws
- If the install fails with a build error, install the missing headers first:
- $ sudo apt install libffi-dev libssl-dev
2016
- On Linux you can use s3cmd to back up to S3 storage in AWS.
- $ s3cmd --configure
  Get the keys from http://aws.amazon.com/ (User Name, Access Key Id, Secret Access Key).
- s3cmd mb s3://backupVigor
Bucket 's3://backupVigor/' created
- Use tar with the incremental option to back up files, pipe through xz to compress, gpg to encrypt if needed, and then pipe straight to s3cmd into the backup file.
- 20160322 - This works great, and there is no need for a local copy of the file while creating the backup.
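A local sketch of that pipeline (gpg step left out; the final s3cmd stage is the only AWS-specific part, so here the stream lands in a local file instead). The /tmp paths and file names are hypothetical:

```shell
# Hypothetical demo tree; the snapshot file is what makes the next run incremental.
mkdir -p /tmp/demo-src && echo "hello" > /tmp/demo-src/file1.txt
# Level-0 backup: tar records state in the snapshot file, xz compresses the stream.
tar --listed-incremental=/tmp/demo.snar -cf - -C /tmp demo-src | xz > /tmp/backup-0.tar.xz
# Add a file; rerunning with the same snapshot file picks up only the changes.
echo "world" > /tmp/demo-src/file2.txt
tar --listed-incremental=/tmp/demo.snar -cf - -C /tmp demo-src | xz > /tmp/backup-1.tar.xz
# In the real setup the xz output is piped to s3cmd instead of a local file,
# assuming an s3cmd version that accepts "-" for stdin:
#   tar --listed-incremental=... -cf - ... | xz | s3cmd put - s3://backupVigor/backup-0.tar.xz
```

The second archive contains only file2.txt plus directory metadata, which is why no full local copy is ever needed.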
- Then set an AWS lifecycle policy on the S3 bucket to move older files to Glacier, and even delete very old files, e.g. after 700 days.
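A sketch of such a lifecycle rule, as fed to `aws s3api put-bucket-lifecycle-configuration --bucket backupVigor --lifecycle-configuration file://lifecycle.json`. The 30-day transition and the rule ID are assumed values; the 700-day expiry is from the note above:

```json
{
  "Rules": [
    {
      "ID": "archive-then-expire",
      "Status": "Enabled",
      "Filter": {"Prefix": ""},
      "Transitions": [{"Days": 30, "StorageClass": "GLACIER"}],
      "Expiration": {"Days": 700}
    }
  ]
}
```

The empty Prefix applies the rule to every object in the bucket; narrow it to a key prefix if only the backups should be archived.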
Linux commandline bash upload to aws s3
...