On peut "remettre" ou plutot rendre disponible de nouveau un message SQS en changeant sa visibility timeout à 0
-
http://boto3.readthedocs.io/en/latest/reference/services/sqs.html#SQS.Client.change_message_visibility
aws s3 ls s3://bucket/path/ --recursive --summarize | grep "Total Objects:"
-
https://links.infomee.fr/?rcG0hg
Annoying gotcha: you can't feed the output of get-repository-policy straight into set-repository-policy to clone a policy.
You also have to strip the stray \n characters left in the response.
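A rough boto3 sketch of the clone, which sidesteps the CLI quoting/newline issue entirely (the repository names are placeholders):
import boto3

ecr = boto3.client('ecr')
# Read the policy from the source repository and apply it to the target one.
policy_text = ecr.get_repository_policy(repositoryName='source-repo')['policyText']
ecr.set_repository_policy(repositoryName='target-repo', policyText=policy_text)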
-
http://docs.aws.amazon.com/cli/latest/reference/ecr/set-repository-policy.html
Python 3: log to a file AND stdout
import logging
logging.basicConfig(handlers=[logging.FileHandler('/var/log/runner/process1.log'),logging.StreamHandler()],format='%(asctime)s %(levelname)s %(message)s',level=logging.INFO)
logging.info('foo')
Even better, to support logrotate without copytruncate:
import logging.handlers
logging.basicConfig(handlers=[logging.handlers.WatchedFileHandler('/var/log/worker/worker1.log'),logging.StreamHandler()],format='%(asctime)s %(levelname)s %(message)s',level=logging.INFO)
The matching logrotate config (no copytruncate needed):
/var/log/worker/*.log {
    monthly
    rotate 12
    compress
    delaycompress
    missingok
    notifempty
    create 644 root root
}
Python 2:
import logging
logger = logging.getLogger('simple_example')
logger.setLevel(logging.INFO)
formatter = logging.Formatter('%(asctime)s %(levelname)s %(message)s')
console_handler = logging.StreamHandler()
console_handler.setLevel(logging.INFO)
console_handler.setFormatter(formatter)
file_handler = logging.FileHandler('/var/log/worker/worker3.log')
file_handler.setLevel(logging.INFO)
file_handler.setFormatter(formatter)
logger.addHandler(console_handler)
logger.addHandler(file_handler)
-
https://docs.python.org/3/howto/logging.html
:~$ docker -H tcp://10.73.204.73:2375 ps
Error response from daemon: client is newer than server (client API version: 1.24, server API version: 1.19)
:~$ DOCKER_API_VERSION=1.19 docker -H tcp://x.x.x.x:xxxx ps
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
-
https://serverfault.com/questions/664999/newer-docker-client-with-older-docker-host
wtf? (it stats and unlinks every file in the current directory in a single pass)
cd dir
time perl -e 'for(<*>){((stat)[9]<(unlink))}'
-
http://www.slashroot.in/which-is-the-fastest-method-to-delete-files-in-linux
When using programs that use GNU Parallel to process data for publication please cite:
O. Tange (2011): GNU Parallel - The Command-Line Power Tool,
;login: The USENIX Magazine, February 2011:42-47.
This helps funding further development; and it won't cost you a cent.
Or you can get GNU Parallel without this requirement by paying 10000 EUR.
To silence this citation notice run 'parallel --bibtex' once or use '--no-notice'.
-
https://links.infomee.fr/?OqiirQ
find -L . -type f | parallel -j 30 rsync -a {} /DESTINATION_EFS_FILESYSTEM
-
https://links.infomee.fr/?bSM2yw
for branch in `git branch -r | grep -v HEAD`; do echo -e `git show --format="%ci %cr" $branch | head -n 1` \\t$branch; done | sort -r
-
https://gist.github.com/jasonrudolph/1810768
So you can put your data into Glacier in two different ways:
1) Directly into Glacier via the API
2) Store it in S3, then a lifecycle management policy moves it to Glacier
Warning: retrieving data from Glacier is expensive, and so is deleting objects that are less than 3 months old.
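A rough boto3 sketch of option 2 (bucket name, prefix, and the 30-day delay are placeholders):
import boto3

s3 = boto3.client('s3')
# Lifecycle rule: transition everything under the 'archive/' prefix to Glacier after 30 days.
s3.put_bucket_lifecycle_configuration(
    Bucket='my-bucket',
    LifecycleConfiguration={
        'Rules': [{
            'ID': 'to-glacier',
            'Status': 'Enabled',
            'Filter': {'Prefix': 'archive/'},
            'Transitions': [{'Days': 30, 'StorageClass': 'GLACIER'}],
        }]
    },
)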
-
https://www.cloudberrylab.com/blog/compare-amazon-glacier-direct-upload-and-glacier-upload-through-amazon-s3/
CloudFormer to create AWS CloudFormation templates from existing AWS resources
-
http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/cfn-using-cloudformer.html
tmux <3
while true; do python monitor_beanstalk.py; bg_color=$([ $? == 0 ] && echo "green" || echo "red"); tmux set-window-option -t${TMUX_PANE} window-status-bg $bg_color; sleep 30; clear; done
-
https://links.infomee.fr/?OlX-xw
-
https://links.infomee.fr/?eFFzMg
aws efs describe-file-systems | jq '.FileSystems|.[]|[.Name, .SizeInBytes.Timestamp, .SizeInBytes.Value]' -c
Returns one line per EFS file system.
Each line is an array with:
[0] = the EFS name
[1] = the timestamp at which the size was computed
[2] = the size in bytes
To get the size in GB:
aws efs describe-file-systems | jq '.FileSystems|.[]|[.Name, .SizeInBytes.Timestamp, .SizeInBytes.Value / 1024 / 1024 / 1024]' -c
aws efs describe-file-systems | jq '.FileSystems|.[]|[.Name, .SizeInBytes.Value / 1024 / 1024 / 1024]' -c
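The same thing from Python, as a rough boto3 sketch:
import boto3

efs = boto3.client('efs')
# One line per file system: name, measurement timestamp, size in GB.
for fs in efs.describe_file_systems()['FileSystems']:
    size_gb = fs['SizeInBytes']['Value'] / 1024.0 / 1024 / 1024
    print(fs.get('Name'), fs['SizeInBytes'].get('Timestamp'), size_gb)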
-
https://links.infomee.fr/?MzuW2A