We deployed cryoSPARC on cbi-gpu-03 and cbi-gpu-04 and we are opening access to everyone. As cryoSPARC is not compatible with the IGBMC authentication system, user accounts have to be created by hand. To get an account, [[cbi_admins@igbmc.fr|send an email to CBI-Admins]], and you will receive the instance URL plus your generated password (which is not stored on our side, so keep it safe).
  
Moreover, to follow the new team/project based storage system, the cryoSPARC service is separated into multiple instances, one per team. Installing it this way avoids confidentiality and security issues.
[[https://cavarelli.cryosparc.cbi.igbmc.fr|https://cavarelli.cryosparc.cbi.igbmc.fr]]
  
To check your position in the queue, log in to cbi-gpu-03 or cbi-gpu-04 with ssh and run the **squeue** command.
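A small sketch of that queue-position check. The **squeue** flags in the comments are real, but the username and the sample output below are made up so the counting step can be tried anywhere:

```shell
# On cbi-gpu-03/04 you would list pending jobs in priority order,
# printing one username per line:
#   squeue --states=PENDING --sort=-p -h -o "%u"
# Below, a made-up sample of that output stands in for the real thing.
ME="alice"            # hypothetical username
sample='bob
carol
alice
dave'
# Your position = line number of your first job in the sorted list.
position=$(printf '%s\n' "$sample" | awk -v u="$ME" '$1 == u { print NR; exit }')
echo "position in queue: $position"
```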
  
----
  
**2022-04-07**

I worked with the IT Services to deploy dedicated domain names for each team.
You can access your cryoSPARC instance with this URL:

[[https://cavarelli.cryosparc.cbi.igbmc.fr|https://cavarelli.cryosparc.cbi.igbmc.fr]]

It is accessible from inside and outside the lab network.

----

**2022-04-21**

In order to bring the Slurm cluster to a fully featured state, I am deploying a shared home system.
This will bring you two features:
  * you can use your home inside your Slurm jobs, as files are shared between the cluster nodes (for custom scripts, for example)
  * you have a place to put your scripts, which was not the case before because of the "by project" structure

There are three details to note:
  * Your home directory on the servers will now be: **/mnt/storage/home/<username>**
  * Some of you already have data inside your homes on some of the servers. That data will still be available inside **/home/<username>**
  * A quota of 30 GB will be put in place for every home.

To be able to run jobs through Slurm, you first have to connect to one of the servers via **ssh** (say cbi-compute-01). This will create a home directory on the storage (**/mnt/storage/home/<username>**). Then, if you want, you can copy back your **.bashrc**, **.bash_aliases**, or any other configuration from **/home/<username>**.
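That copy-back step might look like the sketch below. The old and new homes are simulated with temporary directories so the loop can be run anywhere; on the cluster they would be **/home/$USER** and **/mnt/storage/home/$USER**:

```shell
# Simulated homes (on the cluster: OLD_HOME=/home/$USER,
# NEW_HOME=/mnt/storage/home/$USER).
OLD_HOME="$(mktemp -d)"
NEW_HOME="$(mktemp -d)"
printf 'alias ll="ls -l"\n' > "$OLD_HOME/.bash_aliases"   # stand-in old config
for f in .bashrc .bash_aliases; do
  # copy only the configuration files that actually exist in the old home
  if [ -f "$OLD_HOME/$f" ]; then
    cp "$OLD_HOME/$f" "$NEW_HOME/"
  fi
done
ls -A "$NEW_HOME"
```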

The shared home system did not solve the recent cryoSPARC issue (//permission denied on /home/<user>//). cryoSPARC apparently does not follow the standard Linux home directory layout (one more cryoSPARC issue for the list).
I will try to put a workaround in place during the day.

The cryoSPARC issue should be fixed.

Also, now that our servers are deployed under Slurm, you should know that (when connected to one of the CBI servers) you can run:
<code>sinfo -Nl</code>
to see the node list and, for unavailable nodes, the reason why. This lets you follow along when I am debugging a specific node.
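To spot the unavailable nodes quickly, you can filter that listing. The pipeline below runs on a made-up sample of **sinfo -Nl** output (hostnames reused from this page), since column positions and state names can vary with the Slurm version:

```shell
# On a CBI server: sinfo -Nl | awk 'NR > 1 && $4 !~ /^(idle|alloc)/'
# (assumes STATE is the 4th column; check your own header line).
sample='NODELIST NODES PARTITION STATE
cbi-gpu-03 1 gpu idle
cbi-gpu-04 1 gpu drained
cbi-compute-01 1 compute allocated'
unavailable=$(printf '%s\n' "$sample" | awk 'NR > 1 && $4 !~ /^(idle|alloc)/ { print $1 }')
echo "unavailable: $unavailable"
```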