1. Write the backup script
Create a file named logs_backup.sh with the following content:
#!/bin/bash
export PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/usr/lib/jvm/java-8-oracle/bin:/usr/lib/jvm/java-8-oracle/db/bin:/usr/lib/jvm/java-8-oracle/jre/bin
find /home/www/log/java-app/ -mtime +1 -name "*.gz" -exec cp {} /koala/java-app/foundation8/logs \;
find /home/www/log/java-app -mtime +1 -name "*.gz" -exec rm -rf {} \;
This copies the .gz log files older than one day to /koala/java-app/foundation8/logs and then deletes them from the source directory. -mtime +1 selects files by modification age measured in whole 24-hour periods, and find discards the fractional part, so +1 actually matches files modified at least two full days ago: if it is now 16:00 on July 29, 2019, -mtime +1 matches files last modified before 16:00 on July 27, 2019.
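Because the copy and the delete are two separate find runs, a file that crosses the -mtime threshold between the two runs could be deleted without ever having been copied. A minimal single-pass sketch that moves the archives instead of copying and then removing them (same source and destination paths as above; whether a plain mv is acceptable here is an assumption):
#!/bin/bash
# Hedged variant of logs_backup.sh: move each matching archive in one pass,
# so nothing can be deleted without having reached the backup directory.
export PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
find /home/www/log/java-app/ -mtime +1 -name "*.gz" \
    -exec mv {} /koala/java-app/foundation8/logs \;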
2. Set up the scheduled task:
Edit the cron table with the crontab -e command and add:
00 2 * * * /bin/bash /koala/java-app/foundation6/shell/logs_backup.sh
This entry means: run the backup script logs_backup.sh at 2:00 a.m. every day.
Use the crontab -l command to list the scheduled tasks that are installed.
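If you prefer not to open an editor, the same entry can be appended non-interactively; a small sketch (it assumes you are modifying the crontab of the current user):
# Append the backup job to the current user's crontab, then confirm it is installed.
( crontab -l 2>/dev/null; echo '00 2 * * * /bin/bash /koala/java-app/foundation6/shell/logs_backup.sh' ) | crontab -
crontab -l | grep logs_backup.sh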
Additional notes:
1. Process the user access log into a "status code + request path" form:
The nginx access log looks like this:
60.165.182.107 60.165.182.0 http://server.www.koalareading.com - [27/Jul/2019:23:59:28 +0800] "GET /books/android/spellwords/sortSpellWordsGames?challengeType=1&schoolClassId=137987&sortType=1 HTTP/1.1" 200 0.060 3414 "-" "okhttp/3.8.1" 127.0.0.1:8082 - - SESS=yzj3i9EnGzfs1vLy2DAlLkrYMgc6Y5IJ4SMemwiEYngDxCDbF9YlmXI5mN8vOnI0/R+6ZO+kTAf3Lb8M5VBkgVGaMXoHq5r5RFhzN1v5S1U=
39.154.5.193 39.154.5.0 http://server.www.koalareading.com - [27/Jul/2019:23:59:28 +0800] "GET /users/android/user/current HTTP/1.1" 200 0.037 990 "-" "okhttp/3.8.1" 127.0.0.1:8081 - - SESS=yzj3i9EnGzfs1vLy2DAlLkrYMgc6Y5IJ4SMemwiEYnipo67uQux2uWb4/P9bfEtkYGDcLVXFSsiQDTeHkM5diohha0SFPWYxcdwXcqC2nhY=
39.154.5.193 39.154.5.0 http://server.www.koalareading.com - [27/Jul/2019:23:59:28 +0800] "GET /store/web/clothing/mine?gender=1&userId=5264542 HTTP/1.1" 200 0.026 425 "https://www.koalareading.com/userspace/door?userId=5264542" "Mozilla/5.0 (Linux; Android 8.1.0; CLT-AL00 Build/HUAWEICLT-AL00; wv) AppleWebKit/537.36 (KHTML, like Gecko) Version/4.0 Chrome/66.0.3359.126 MQQBrowser/6.2 TBS/044807 Mobile Safari/537.36Android/koalastudent/4100" 127.0.0.1:8089 - - zg_did=%7B%22did%22%3A%20%22167e9a414720-07fcc31a077f-6c4a693f-41a78-167e9a4147483%22%7D; SESS=yzj3i9EnGzfs1vLy2DAlLkrYMgc6Y5IJ4SMemwiEYnipo67uQux2uWb4/P9bfEtkYGDcLVXFSsiQDTeHkM5diohha0SFPWYxcdwXcqC2nhY=; zg_fba17a2135f54785aa88d686272386e4=%7B%22sid%22%3A%201564241675704%2C%22updated%22%3A%201564243167846%2C%22info%22%3A%201564241675714%2C%22superProperty%22%3A%20%22%7B%7D%22%2C%22platform%22%3A%20%22%7B%7D%22%2C%22utm%22%3A%20%22%7B%7D%22%2C%22referrerDomain%22%3A%20%22%22%2C%22zs%22%3A%200%2C%22sc%22%3A%200%2C%22firstScreen%22%3A%201564241675704%7D
182.51.86.38 182.51.86.0 http://server.www.koalareading.com - [27/Jul/2019:23:59:28 +0800] "GET /tasks/android/readingTasks/student/getPracticeByType?size=10&pageNo=0&type=0 HTTP/1.1" 200 0.022 103 "-" "okhttp/3.8.1" 127.0.0.1:8083 - - SESS=yzj3i9EnGzfs1vLy2DAlLkrYMgc6Y5IJ4SMemwiEYngCb7AYWY2yagL90LZU4qbKS5UN19TYIVLLQvTYMctvCll4o6+AWJgWh/x1hHctvIs=
124.238.24.129 124.238.24.0 http://server.www.koalareading.com - [27/Jul/2019:23:59:28 +0800] "GET /settings/android/serverStatus/maintainPrediction HTTP/1.1" 200 0.004 83 "-" "okhttp/3.8.1" 127.0.0.1:8088 - - SESS=yzj3i9EnGzfs1vLy2DAlLkrYMgc6Y5IJ4SMemwiEYni5PE+/l8bUGi4SpH06EWIBvGvShy89XDXS3HN2jUTfvpdUQCNk1tInmpBfe2k6QS8=
It needs to be processed into the following format:
200 /books/web/readResource/checkPraise?articleId=12722&type=7
where the first column is the response code and the second column is the request path.
The following script can be used:
awk '($10 ~ /401|403|404|500|502|503|504|200/)' access.log-`date -d "yesterday" +"%Y%m%d"` | awk '{print $10,$8}' > data.log
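The two piped awk calls can also be collapsed into a single pass; a sketch under the assumption that, as in the log format above, $10 is the status code and $8 is the request path (anchoring the pattern makes it an exact match on the status code rather than a substring match):
# Filter yesterday's access log by status code and print "status path" in one awk pass.
awk '$10 ~ /^(200|401|403|404|500|502|503|504)$/ {print $10, $8}' access.log-`date -d "yesterday" +"%Y%m%d"` > data.log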
2. Count the number of requests per status code
For example, suppose I have a log file data.log in the following format:
200 /users/android/join/checkJoinClassRequest
200 /books/android/home/3/list
200 /books/android/bannerVip/list?moduleType=4
200 /users/android/vip/pay/queryOrder?outTradeNo=91156372468445
200 /tasks/android/advance/selfStudentTasks/queryErTestProgress
200 /tasks/android/teacher-app/new/class/task/list?classId=1727&startCount=0&count=10&history=1
200 /users/android/vip/getVipLevel
200 /tasks/android/advance/selfStudentTasks/querySelfErTestStatus
200 /users/android/user/clientid/saveOrUpdate?plat=ANDROID&clientId=27321a7a5098fd4e6560261fa41e49bf&deviceType=Other
200 /users/android/user/clientid/saveOrUpdate?plat=ANDROID&clientId=CN_7f5be5371b514b65146638f62bf3775f&deviceType=Oppo
200 /messages/android/student/list?type=1
200 /books/android/fm/albums/editPlayCount/d3a82aae795e4349895f2cf9024d9a06/FfctJrATes8xy9NtGSoCrh94XVBaNb3d
200 /store/android/signIn/1
200 /users/android/user/clientid/saveOrUpdate?deviceType=Other&plat=ANDROID&clientId=1d69389b0e5c9f2c50dc2b1c2a9985fa
200 /users/android/user/clientid/saveOrUpdate?plat=ANDROID&clientId=c3599c2b3ca13459d1fb4bd50a88f774&deviceType=Other
200 /books/web/readResource/article?type=7&articleId=12722
200 /books/web/readResource/checkPraise?articleId=12722&type=7
where the first column is the response status code and the second column is the request path with its parameters; the first path segment (books, users, tasks, ...) is the name of the project that handled the request.
The statistics can be produced with the following script:
#!/bin/sh
LOGPATH=data.log
echo "###########################course########################################"
project_6=course
echo -e $project_6 status 200 = `cat $LOGPATH |grep $project_6 |awk '($1 ~ /200/)' |wc -l`
echo -e $project_6 status 401 = `cat $LOGPATH |grep $project_6 |awk '($1 ~ /401/)' |wc -l`
echo -e $project_6 status 403 = `cat $LOGPATH |grep $project_6 |awk '($1 ~ /403/)' |wc -l`
echo -e $project_6 status 404 = `cat $LOGPATH |grep $project_6 |awk '($1 ~ /404/)' |wc -l`
echo -e $project_6 status 499 = `cat $LOGPATH |grep $project_6 |awk '($1 ~ /499/)' |wc -l`
echo -e $project_6 status 500 = `cat $LOGPATH |grep $project_6 |awk '($1 ~ /500/)' |wc -l`
echo -e $project_6 status 502 = `cat $LOGPATH |grep $project_6 |awk '($1 ~ /502/)' |wc -l`
echo -e $project_6 status 503 = `cat $LOGPATH |grep $project_6 |awk '($1 ~ /503/)' |wc -l`
echo -e $project_6 status 504 = `cat $LOGPATH |grep $project_6 |awk '($1 ~ /504/)' |wc -l`
Explanation: awk '($1 ~ /200/)' selects the lines whose first column matches 200 (a regular-expression match; since status codes are exactly three digits, this behaves like an equality test), and wc -l counts the lines (strictly speaking, it counts newline characters).
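Instead of running one grep/awk/wc pipeline per status code, the counts can also be collected in a single pass; a sketch under the same assumptions (data.log as above, column 1 holding the status code, and the project name being whatever the script greps for, course here):
#!/bin/sh
# Hedged sketch: count every status code for one project in a single pass over data.log.
LOGPATH=data.log
project=course
grep "$project" "$LOGPATH" | awk '{count[$1]++} END {for (s in count) print "status", s, "=", count[s]}' | sort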
3. Back up the nginx access log to a dedicated directory every night:
Create the script log_hebing.sh with the following content:
#!/bin/bash
/bin/cat /var/log/nginx/access.log-`date -d "yesterday" +"%Y%m%d"` >> /koala/java-app/logs_Calculation/access.log-`date -d "yesterday" +"%Y%m%d"`
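The cat call will complain if yesterday's access log does not exist (for example after a change in log rotation); a defensive sketch of the same merge, using the paths from the script above:
#!/bin/bash
# Hedged sketch: only append yesterday's nginx access log if it actually exists.
SRC=/var/log/nginx/access.log-$(date -d "yesterday" +"%Y%m%d")
DST=/koala/java-app/logs_Calculation/access.log-$(date -d "yesterday" +"%Y%m%d")
[ -f "$SRC" ] && /bin/cat "$SRC" >> "$DST"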
Schedule it to run every night by adding an entry to crontab:
30 2 * * * /koala/shell/log_hebing.sh
This runs the log_hebing.sh script at 2:30 a.m. every day.
4. Write a log search script
The script content is as follows:
#!/bin/bash
echo "Log path to search: $1"
echo "Keyword: $2"
path=$1
keyword=$2
if [ "${path##*.}" == log ]; then
    grep -C 20 "$keyword" $path
elif [ "${path##*.}" == gz ]; then
    zcat $path | grep --binary-file=text "$keyword" -C 20
else
    echo "Invalid path format"
fi
Explanation: the script takes two arguments, the full path of the log file and the keyword to search for. Based on the file extension it decides whether this is today's log (ending in .log) or an older log that has already been compressed (ending in .log.gz), and uses the appropriate command to search it with 20 lines of context.
Usage example:
./grep_log.sh /koala/java-app/foundation8/logs/koala-course.2019-07-23.0.log.gz E20190723142318060100017
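Searching the current day's uncompressed log works the same way; the .log filename below is only illustrative:
./grep_log.sh /koala/java-app/foundation8/logs/koala-course.log E20190723142318060100017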