Sunday, October 9, 2016

My long, bad [almost solved] adventures with WD MyCloud & Hikvision DVR

The Hikvision DVR doesn't see the WD MyCloud as a compatible NAS simply because of WD's bad NFS support. Here's how to solve it; it's also useful if you need to enable NFS for other reasons.


cd /etc
cp exports exports.orig
chmod 400 exports.orig
nano exports

You will find a line like    /nfs *( bunch of options here... )
Just add your folder, e.g. /nfs/Public, on its own line (or other folders on other lines), copy-pasting the options you find for /nfs (sketched below). Leave the /nfs line alone, don't modify it: I ran into problems here, and at least the /nfs share should keep working. In the end, after several tests and combinations, I decided to leave only the /nfs line, because every attempt failed: my custom shared folder kept appearing and disappearing. This is due to a Hikvision bug that requires the shared folder to have a limited quota, and the WD MyCloud doesn't support quotas. We can bypass this limitation via any external Linux machine: we mount the MyCloud over FTP, expose it to the system as a filesystem, add it to /etc/fstab and export it via NFS.
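Just as an illustration, the result would look more or less like this; the rw,sync,no_subtree_check options are only placeholders, copy the real ones from your own /nfs line:

/nfs          *(rw,sync,no_subtree_check)    # original line, left untouched
/nfs/Public   *(rw,sync,no_subtree_check)    # added line, same options copied over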

If you still want to enable NFS on the WD MyCloud for a future quota fix (I will keep trying, don't worry...), fix portmapper & rpcbind or NFS won't work after a restart:

update-rc.d rpcbind enable && update-rc.d nfs-common enable

Manually restart everything you need

service rpcbind restart
/etc/init.d/nfs-kernel-server restart
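
To double-check that everything actually came back up on the MyCloud before involving the DVR (rpcinfo and showmount should already be there once rpcbind and nfs-common are in place):

rpcinfo -p | grep -E 'portmapper|nfs'
showmount -e localhost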

Please note that you may need to repeat the procedure after a firmware update. It's good practice to save a template, e.g. /shares/exports.tpl, in a folder that shouldn't be overwritten by new firmware, so you have your exports file handy to copy-paste back.
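Something along these lines (just a sketch, /shares/exports.tpl being the example path from above):

cp /etc/exports /shares/exports.tpl    # keep a copy in a share that firmware updates leave alone
# after a firmware update:
cp /shares/exports.tpl /etc/exports
exportfs -ra                           # reload the exports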

Now you can press "Search" on the Hikvision DVR, putting in the IP of your NAS, and it will reply with "/nfs/Public". Cheers to me!

On the "Manage Storage" tab, it will say "unformatted". You can safely press FORMAT, since it won't actually format anything in the strict meaning, it will just write new files & folders needed by the Hikvision system to properly use the NAS. UPDATE - after a while it will recognize that quota is missing and return an error. I tried making an .img file with dd, but the ubuntu under MyCloud doesn't provide a lot of supporting tools unless you don't manually compile everything... too many hours for my poor time budget.

WORKAROUND
As described, the only workaround is via a 3rd machine. I have a server providing AVI streaming around my LAN, so I decided to use that one. A shared Hostmonster plan with ssh access could work as well, if you trust yourself to ssh-tunnel everything for security reasons (instructions provided elsewhere in this blog).

Add e.g. a hikvision user to your remote FTP users, and assign it a folder and a quota. I did it via cPanel on Hostmonster. Put any dummy file in your FTP folder, so you can check later whether it was properly mounted on the remote machine / Hikvision. My user would be hikvision@ftp.yourdomain.com with the password "FTPpassw0rd".
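If you prefer the command line to cPanel's file manager for the dummy file, something like this works (credentials are the example ones from above):

echo "hikvision mount test" > dummy.txt
curl -T dummy.txt --user 'hikvision@ftp.yourdomain.com:FTPpassw0rd' ftp://ftp.yourdomain.com/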

On your 3rd-party operating system, stop iptables, set SELinux from "enforcing" to "disabled", add a dedicated user and find out the uid & gid assigned by the operating system:
useradd -m hikvision -p 'Passw0rd'    # note: -p expects an already-hashed password, running passwd hikvision afterwards is safer
id hikvision                          # note the uid & gid, you will need them for the mount and export options
service iptables stop
chkconfig iptables off
nano /etc/sysconfig/selinux
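
The line to change in that file is the SELINUX one (standard CentOS/RHEL layout; it takes effect at the next reboot, and setenforce 0 drops SELinux to permissive right away if you don't want to reboot):

SELINUX=disabled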

Install curlftpfs, assign the proper permissions, mount the FTP share and check for the dummy file:
mkdir /ftpmount
chown hikvision:hikvision /ftpmount -Rf

yum install fuse* libcurl* glib* glibc.i686 file-libs file-devel file-static curl curlftpfs -y

curlftpfs -o allow_other,nonempty,ftpfs_debug=5 -v hikvision%40ftp.yourdomain.com:FTPpassw0rd@ftp.yourdomain.com /ftpmount

ls -la /ftpmount
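
If you want to remount cleanly with the credentials file described next, unmount this first debug mount beforehand (fusermount comes with the fuse packages installed above):

fusermount -u /ftpmount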

Now you can create a file with protected credentials, so you don't have to pass them on the command line in clear text (remember to clean ~/.bash_history as soon as you have completed all the settings):
nano ~/.netrc

INSIDE THE FILE PUT:
machine ftp.yourdomain.com
login hikvision@ftp.yourdomain.com
password FTPpassw0rd

Then:
chmod 600 ~/.netrc
curlftpfs ftp.yourdomain.com /ftpmount -o uid=500 -o gid=500 -o allow_other    # use the uid & gid you noted from id hikvision
ls -la /ftpmount

You should be able to see the dummy file again. Then it's time to set up NFS:
yum install quota -y
yum install nfs-utils nfs-utils-lib
nano /etc/exports

INSIDE THE FILE PUT:
/ftpmount 192.168.1.0/24(insecure,rw,fsid=1,anonuid=500,anongid=500,sync,no_root_squash,no_subtree_check)

nano /etc/fstab

INSIDE THE FILE PUT (I chose the fuse filesystem type, but you can use nfs or whatever else):
curlftpfs#ftp.yourdomain.com /ftpmount fuse rw,noauto,user,uid=500,gid=500,allow_other,_netdev 0 0
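Since the entry uses noauto it won't mount by itself at boot; you can test it (and mount it after a reboot) with a plain:

mount /ftpmount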

Then:
chkconfig nfs on
service rpcbind start
service nfs start
exportfs -a
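
A quick sanity check that the export is actually live on this machine:

exportfs -v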

Then let quotacheck create the quota files:
quotacheck -cug    # quotacheck wants the target filesystem as an argument here (or use -a as in the next line)
quotacheck -avug

From an external machine, you can check if NFS is properly active and configured:
showmount -e 192.168.1.x
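
Still from that external Linux box, you can also do a quick test mount (192.168.1.x being the third machine's LAN IP, /mnt just an example mount point):

mount -t nfs 192.168.1.x:/ftpmount /mnt
ls -la /mnt
umount /mnt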

Go back to the Hikvision. On the "Manage Storage" tab it will say "unformatted". You can safely press FORMAT, since it won't actually format anything in the strict meaning: it will just write the new files & folders needed by the Hikvision. You should now see the file "info.bin". A folder "datadir0" will be created when the first mp4 file is written, containing the mp4 videos, indexes and some logfiles.