Saturday, October 10, 2015

linux - Cron Job not running ( too many open files )



Sorry, I may be a bit of a newbie here; I have never really run any cron jobs before.



Anyway, I am running an Arch Linux server with cronie on it, and I have set up this cron job:



10 * * * * sh /home/cron/CronScripts/svnbackup.sh


which should run the script 'svnbackup.sh'. However, when I tail -f the logs, I see the following errors:





Dec 16 12:00:01 Aramis /usr/sbin/crond[536]: (root) CAN'T OPEN (/etc/crontab): Too many open files
Dec 16 12:00:01 Aramis /usr/sbin/crond[536]: (CRON) OPENDIR FAILED (/etc/cron.d): Too many open files
Dec 16 12:00:01 Aramis /usr/sbin/crond[536]: (CRON) OPENDIR FAILED (/var/spool/cron): Too many open files




If I run the script directly it works fine and does exactly what it is supposed to do, so why won't cron run it?



Any help would be great.




Thanks!


Answer



This isn't a problem with your script; it's a problem with your system. As you might guess from the error, there are too many files open.



Check /proc/sys/fs/file-max to see what your system-wide limit currently is. If it's too low, you can raise it at runtime using sysctl. To make the change persist across reboots, also add a line to /etc/sysctl.conf:



fs.file-max = 65536



(for example)
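
A minimal sketch of those steps (run as root; the 65536 value is only an example, pick whatever suits your workload):

# Show the current system-wide limit on open file handles
cat /proc/sys/fs/file-max

# Show how many handles are in use: allocated, free, and the maximum
cat /proc/sys/fs/file-nr

# Raise the limit immediately, without rebooting
sysctl -w fs.file-max=65536

# Persist the new limit across reboots
echo "fs.file-max = 65536" >> /etc/sysctl.conf

After that, restart crond (or wait for the next scheduled run) and check the logs again to confirm the errors are gone.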

