Welcome to the MapMan Family of Software Forum

Please do not hesitate to register and post your question.

Don't forget to subscribe to your posted message so you get notified of updates.
Every question you post will help others and/or enhance the software!

Post a question, post a bug!


Using MapMan

Memory in Windows
Answer
3/24/11 8:35 PM
Hi,
I uploaded 36 .CEL files into Robin and was carrying out the quality check step when I encountered a problem with memory. I got the following error message:


Loading required package: Biobase
Welcome to Bioconductor
  Vignettes contain introductory material. To view, type 'openVignette()'.
  To cite Bioconductor, see 'citation("Biobase")' and for packages
  'citation(pkgname)'.
Loading required package: Biostrings
Loading required package: IRanges
Attaching package: 'IRanges'
The following object(s) are masked from package:base :
  cbind, duplicated, order, pmax, pmax.int, pmin, pmin.int, rank, rbind,
  rep.int, sapply, sort, table, unique
Error: cannot allocate vector of size 372.1 Mb
In addition: Warning messages:
1: In dimnames(x) <- dn :
  Reached total allocation of 1535Mb: see help(memory.size)
2: In dimnames(x) <- dn :
  Reached total allocation of 1535Mb: see help(memory.size)
3: In dimnames(x) <- dn :
  Reached total allocation of 1535Mb: see help(memory.size)
4: In dimnames(x) <- dn :
  Reached total allocation of 1535Mb: see help(memory.size)
Execution halted


I repeated the run without the quality check step, but encountered the problem again in the statistical identification of differentially expressed genes step. Is there a way of expanding the amount of memory available to Robin?

Regards,
Martin

RE: Memory in Windows
Answer
3/24/11 8:45 PM as a reply to Martin O'Donoghue.
Hi Martin,

the error message indicates that the R engine embedded in Robin ran out of memory when trying
to allocate more than ~1.5 GB. How much memory does your machine have? Are you sure that
there is more available? Which OS are you using? As far as I know, 32-bit Windows XP can only
give about 3 GB of memory to applications. Basically, the R embedded in Robin should have
access to all free memory on your machine, so the fact that it crashed might indicate that
there was not enough free memory available.
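
As a side note, the "see help(memory.size)" hint in your error output points at R's
Windows-specific memory functions. If you open an R console yourself, you can inspect
(and try to raise) the allocation ceiling like this:

memory.limit()            # current allocation limit in MB
memory.limit(size=2047)   # request a higher limit; capped by what the OS can provide

These functions only exist in the Windows build of R.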

best greetings,
Marc

RE: Memory in Windows
Answer
3/25/11 3:59 PM as a reply to Marc Lohse.
Hi Marc,
My computer has 3.46GB of RAM. It's a 32-bit machine running Windows XP. Is it possible to expand the amount of memory R is using?

Thanks,
Martin

RE: Memory in Windows
Answer
3/25/11 4:56 PM as a reply to Martin O'Donoghue.
Hi Martin,

I looked into your problem a bit, and it might be possible to fix it by running the main
analysis script that is saved in the source directory of the Robin project from the command
line. To do so, click "Start" -> "Run" in your Windows start menu and type "cmd" to open a
command console.

Use the "cd" command to navigate to the directory in which Robin was installed, and then into
the R/R-2.9.1/bin directory. Now type

R.exe --max-mem-size=4000M --max-vsize=4000M --file PATH_TO_THE_MAIN_ANALYSIS_SCRIPT

This should start R and run the script.

I hope this helps.

Best regards,
Marc

RE: Memory in Windows
Answer
3/29/11 9:57 AM as a reply to Marc Lohse.
Hi Marc,
In your command "R.exe --max-mem-size=4000M --max-vsize=4000M --file PATH_TO_THE_MAIN_ANALYSIS_SCRIPT", what does "--file PATH_TO_THE_MAIN_ANALYSIS_SCRIPT" refer to? I typed the command in DOS, but got

--file unknown
PATH_TO_THE_MAIN_ANALYSIS_SCRIPT ignored

Was I supposed to put a particular file path in there? If so which one?

DOS then said: "4000M too large. Taken as 2047M", but when I ran Robin again it gave the same error message with the same memory limit, i.e. 1535Mb.

Thanks,
Martin

RE: Memory in Windows
Answer
3/29/11 10:22 AM as a reply to Martin O'Donoghue.
Hi Martin,

Sorry, that was a misunderstanding - I guess I should have explained it more clearly:
in the command line that starts R from the command console, you have to replace
the "PATH_TO_THE_MAIN_ANALYSIS_SCRIPT" part with the actual path to the
main analysis script in your Robin project directory. So, for example, if you ran Robin
and created a project directory under "C:/mydata/myexpressionanalysis/robin_projectX",
the path to the main analysis script would be "C:/mydata/myexpressionanalysis/robin_projectX/source/robin_projectX_main_analysis.R"

So please retry running R from the command line using the correct path. The message you got
when running R from the cmd console indicates that it can use up to 2047M to allocate
data - so with some luck you might be able to run your analysis. Unfortunately, running
R with custom memory allocation settings right now only works from the command
line. When using Robin, R will always run with standard settings. I will include smarter
handling of this in the next release of Robin.

Hope this helps - best greetings,
Marc

RE: Memory in Windows
Answer
3/29/11 5:39 PM as a reply to Marc Lohse.
Hi Marc,
I seem to be having a problem with the code. I've typed:

R.exe --max-mem-size=2047M --max-vsize=2047M C:/Documents and Settings/MODONOGHUE/Desktop/Rice_data/source/Rice_data_main_analysis.R

but then I get the following errors:

ARGUMENT 'C:\Documents' __ignored__

ARGUMENT 'and' __ignored__

ARGUMENT 'Settings/MODONOGHUE/Desktop/Rice_data/source/Rice_data_main_analysis.R' __ignored__

I've tried variations like putting --file before the file path, but DOS doesn't seem to understand it. I've also moved the folder around, tried '\' as well as '/', and put the file path in quotes (i.e. 'C:/Documents....'), but nothing seems to change.

Can you see what I am doing wrong?

Thanks,
Martin

RE: Memory in Windows
Answer
3/29/11 5:46 PM as a reply to Martin O'Donoghue.
Hi Martin,
You have to prepend --file= to the path to your script file to
make R execute the script.
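
In other words, the call from my earlier post becomes:

R.exe --max-mem-size=2047M --max-vsize=2047M --file=PATH_TO_THE_MAIN_ANALYSIS_SCRIPT

(with PATH_TO_THE_MAIN_ANALYSIS_SCRIPT replaced by the real path to your script, as before).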

Best Greetings
Marc

RE: Memory in Windows
Answer
3/29/11 6:09 PM as a reply to Marc Lohse.
Hi Marc,
That seems to have worked. I had to move the folder to the C: drive, as DOS had trouble with the spaces in "Documents and Settings", but it ran the code. It has encountered an error, however:

....
....
> # now read in the data and normalize it
> data <- ReadAffy(filenames=PARAM_INPUTFILES)
Error: cannot allocate vector of size 372.1 Mb
Execution has halted.

Does this mean that there simply isn't enough memory on this computer to analyse a dataset this big?

Thanks,
Martin

RE: Memory in Windows
Answer
3/30/11 8:53 AM as a reply to Martin O'Donoghue.
Hi Martin,

The problem with spaces in path names can be solved by simply enclosing the
path in double quotes.
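
For example, using the path from your earlier post:

R.exe --max-mem-size=2047M --max-vsize=2047M --file="C:/Documents and Settings/MODONOGHUE/Desktop/Rice_data/source/Rice_data_main_analysis.R"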

Regarding your memory problem, the situation seems a bit strange, since R suddenly
fails to allocate a relatively small vector (372.1 Mb). Maybe try running R with just the
--max-vsize=2047M argument, but this is just a wild guess. If this also fails, it might
really be that your dataset is too large to be analyzed on a 4 GB WinXP machine.

May I ask which chip platform you are using?

Bests,
Marc

RE: Memory in Windows
Answer
3/31/11 10:03 AM as a reply to Marc Lohse.
Hi Marc,
That didn't seem to work either. Thanks for your help though. I used .CEL files from the Affymetrix platform. In total there are 36 chips and so 36 .CEL files.

Regards,
Martin

RE: Memory in Windows
Answer
3/31/11 10:16 AM as a reply to Martin O'Donoghue.
Hi Martin,

Maybe you can try loading just a subset of your CEL files into Robin and computing a subset
of the contrasts you want to extract. Are the chips you are using ATH1 Affy chips, or a
platform with more probe sets?

Alternatively, you could copy the R script file that Robin generated (plus the CEL files) to a
more powerful computer (running a 64-bit OS and 64-bit R) and execute it there. This would,
though, require you to adjust all the paths in the script file manually.
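
For instance, if the generated script reads the chips via a variable like the
PARAM_INPUTFILES seen in your error output, the paths in that definition are what you would
edit (the file names below are made up, just for illustration):

PARAM_INPUTFILES <- c("D:/Rice_data/chip_01.CEL",
                      "D:/Rice_data/chip_02.CEL")   # ...and so on for all 36 files
data <- ReadAffy(filenames=PARAM_INPUTFILES)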

bests,
Marc

RE: Memory in Windows
Answer
3/31/11 11:17 AM as a reply to Marc Lohse.
Hi Marc,
I have split the dataset in two - that seems to be my best option. Reducing the number of .CEL files to a subset of 18 has worked, and I think that should be fine for the biological questions I am asking.

The Affymetrix chips are rice chips. I'm not sure whether these have more probe sets than the ATH1 chips.

Thanks for all your help.

All the best,
Martin