NZBGet - NZBget failing to start | PlexGuide.com

NZBget failing to start


timmeh

Experienced+
Original poster
Feb 14, 2019
131
31
Hi all,

I'm currently up a mountain in Wales without access to my server, but have access to portainer etc...

Wondering if I can fix this error from here or whether it'll have to wait until I get home:

Code:
[cont-init.d] 10-adduser: exited 0.
[cont-init.d] 30-config: executing...
[cont-finish.d] executing container finish scripts...
[cont-finish.d] done.
[s6-finish] waiting for services.
[s6-finish] sending all processes the TERM signal.
[s6-finish] sending all processes the KILL signal and exiting
NZBGet is failing to restart. It looks like a permissions issue. I also had to log back in through PGguard to access my apps, for the first time in days. Any suggestions for a fix through Portainer to get downloads going again while I'm away will be warmly received!
 

indochild

Active
Mar 13, 2018
47
20
Do you have enough free space on your main HD? That error happened to me when I got below 3 GB, I believe.
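For anyone landing on this thread later, a quick way to check from a shell is below; the 3 GB figure is just the rough threshold mentioned here, and the exact mount point may differ on your setup.

Code:
# Free space on the root filesystem, where the app configs and
# download folders normally live on this kind of setup.
df -h /

# Print a warning if available space has dropped below roughly 3 GiB.
df -Pk / | awk 'NR==2 && $4 < 3*1024*1024 {print "Low disk space:", $4, "KiB free"}'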
 
  • Like
Reactions: 1 user

Admin9705

Administrator
Project Manager
Donor
Jan 17, 2018
5,156
2,113
Very true. If you're out of disk space, it's an issue. Some versions of PGBlitz would shut down NZBGet if free space dropped under 3GB to prevent you from being locked out.
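For reference, that kind of guard can be approximated with a small cron'd script. This is only a sketch of the idea, not the actual PGBlitz check, and it assumes the container is simply named nzbget:

Code:
#!/bin/bash
# Sketch of a low-disk guard: stop the NZBGet container when free space
# on the root filesystem drops below 3 GB.
MIN_KB=$((3 * 1024 * 1024))                     # 3 GB expressed in KiB
AVAIL_KB=$(df -Pk / | awk 'NR==2 {print $4}')   # available KiB on /

if [ "$AVAIL_KB" -lt "$MIN_KB" ]; then
    docker stop nzbget                          # container name assumed
fi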
 
  • Like
Reactions: 1 user

timmeh

Experienced+
Original poster
Feb 14, 2019
131
31
Hi guys,

Home now and access to server.

Yes, the issue is with hard drive space; I recently set up PGBlitz unencrypted with 20 users, and all was well for a few days.
Now my 2TB hard drive is full... and I can't find the directory that's causing the issue. Nothing found in /mnt/move/ or /mnt/downloads; 3,000 queued files in /mnt/nzb, but they're not taking up much space.

I ran sudo df -h and got the following:
Code:
Filesystem      Size  Used Avail Use% Mounted on
udev             16G     0   16G   0% /dev
tmpfs           3.2G  1.9M  3.2G   1% /run
/dev/md2        1.8T  1.7T  4.7G 100% /
tmpfs            16G     0   16G   0% /dev/shm
tmpfs           5.0M     0  5.0M   0% /run/lock
tmpfs            16G     0   16G   0% /sys/fs/cgroup
/dev/md1        488M  114M  349M  25% /boot
tmpfs           3.2G     0  3.2G   0% /run/user/0
gdrive:         1.0P  332G  1.0P   1% /mnt/gdrive
tdrive:         1.0P     0  1.0P   0% /mnt/tdrive
pgunion         2.1P  2.1T  2.1P   1% /mnt/unionfs
I ran sudo du -h --max-depth=1 and got the following:
Code:
49M    ./.cache
8.0K    ./.gnupg
16K    ./.local
4.0K    ./.config
168K    ./.ansible
4.0K    ./.ssh
49M    .
Any troubleshooting tips are warmly welcomed so I can find what's taking up so much space.

Thanks in advance,
TIMMEH!
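A side note for anyone hitting the same thing: that du run looks like it was started from the home directory (hence the dotfile folders and the 49M total), so it never touched the 1.7T reported for /dev/md2. Scanning from / while staying on that one filesystem should surface the heavy directory, for example:

Code:
# -x stays on the root filesystem, so the gdrive/tdrive/unionfs mounts
# are not walked; sort the per-directory totals so the biggest come last.
sudo du -xh --max-depth=1 / 2>/dev/null | sort -h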
 

timmeh

Experienced+
Original poster
Feb 14, 2019
131
31
OK, so NCDU has revealed 1.6TB stuck in /mnt/downloads/nzbget/movies/ which I must have missed in my manual search.
Any reason why these are stuck here and not getting moved across?
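For anyone else hunting space the same way, the ncdu invocation is short (it may need installing first, e.g. via apt):

Code:
# Interactive disk usage browser; -x keeps it on the root filesystem
# so the cloud mounts are skipped.
sudo ncdu -x /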
 

Admin9705

Administrator
Project Manager
Donor
Jan 17, 2018
5,156
2,113
It's Sonarr and Radarr losing track of them. I'm working on PG9 to better handle this.
 
  • Like
Reactions: 1 user

timmeh

Experienced+
Original poster
Feb 14, 2019
131
31
Hmmm... well, Radarr was, for some unknown reason, not moving files across, so I've deleted the directory and its subdirectories, restarted the server, and started those downloads again.
 
  • Like
Reactions: 1 user

PlexPlex

Experienced
Staff
Aug 8, 2019
53
8
I wish they'd automated some of this better myself. Can't complain with what's free, though.
 
  • Like
Reactions: 1 user

timmeh

Experienced+
Original poster
Feb 14, 2019
131
31
Seems a shame that those folders need constant monitoring. Let's hope PG9 helps solve some of these issues.
On the plus side, at least blacklisting makes the re-downloading process easier. Just a shame to have to go through the process at all.
 

timmeh

Experienced+
Original poster
Feb 14, 2019
131
31
Is there a script or function that could delete files in that folder after 24 hours? That would stop the HD from ever filling up. Would the effect of that be to automatically blacklist the file and trigger a re-download?
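In the meantime, something along these lines could presumably be cron'd; it's only a sketch, assuming the /mnt/downloads/nzbget/movies path from above and that anything untouched for over 24 hours in there is safe to drop:

Code:
#!/bin/bash
# Hypothetical cleanup sketch (not a PG-provided script): remove anything in
# the NZBGet movies download folder that hasn't been modified for 24+ hours.
DL_DIR="/mnt/downloads/nzbget/movies"   # path reported earlier in the thread

find "$DL_DIR" -mindepth 1 -type f -mmin +1440 -delete   # stale files
find "$DL_DIR" -mindepth 1 -type d -empty -delete        # then any now-empty folders

Whether Radarr then blacklists the release and grabs a fresh copy depends on how it handles the failed import, so that part is worth testing before relying on it.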
 
  • Like
Reactions: 1 user

Admin9705

Administrator
Project Manager
Donor
Jan 17, 2018
5,156
2,113
There is a script that deletes junk files in the completed folder, but not in incomplete. Again, PG9 will resolve several of these issues.
 
  • Like
Reactions: 1 user

timmeh

Experienced+
Original poster
Feb 14, 2019
131
31
There is a script that deletes junk files in the completed folder, but not in incomplete. Again, PG9 will resolve several of these issues.
Awesome work, dude. Roll on PG9!
 
  • Like
Reactions: 1 user

Admin9705

Administrator
Project Manager
Donor
Jan 17, 2018
5,156
2,113
Awesome work, dude. Roll on PG9!
The only problem I'm concerned about with deleting from incomplete is this: say you have an uploading backlog, then items that have been pending will all be deleted. I could give the user the choice, but the PG9 setup should resolve some of the permission issues, which are also a contributing factor at times.
 
  • Like
Reactions: 1 user
