CrashPlan is Java-based, uses quite a bit of memory, and will get crashy if you don't allocate enough. Code42's guideline (I think) is 1 GB of memory for every TB of backup source; in practice I've gotten away with 1 GB per 2 TB, or less. (It seems that until last week it worked this way. In that way it is pretty self-regulating: you'll run into memory issues before you run afoul of Code42.) In the Code42 agent, memory allocation is dynamically set to use 25% of the physical memory on the device. The agent only needs about 600 MB of memory per 1 TB of storage (or per 1 million files), but this allocation is intended to account for growth in your file selection.

The two DS412+ machines that I know work are not experiencing this, but their backup sizes are smaller and they are running fewer services than yours. Your CrashPlan memory requirements are barely being met. The machine is paging, which is always bad, and your file system buffers/cache show that you are likely working the drives harder than needed.

There are generous retention settings which you can adjust; this is an unusual and very helpful feature for managing data efficiently. However, in the interests of limiting memory usage by the app, CrashPlan recommends that you do not increase the number of versions retained from the default setting.

I use Windows 7 Pro 32-bit (not 64-bit), with 32 GB of memory installed (3 GB accessible, and the pagefile on a large RAM disk in the inaccessible memory). However, the Java VM cannot start with -Xmx1500M, I guess because it fails to allocate this memory. (With -Xmx1000M it does start, but crashes later on.) I would expect that Windows would swap out other processes in order to allocate memory for Java and, since Java does not use it anyway, would swap Java's unused memory to the pagefile. Say, right now I see a MyDefrag process using a 1700 MB allocated set with no problem.
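The sizing rule quoted above can be turned into simple arithmetic. This is my own illustration (the class and method names are hypothetical, not Code42 code), assuming the guideline of roughly 600 MB of heap per 1 TB of backup source, or per 1 million files, whichever is larger:

```java
// Hedged sketch of the Code42 sizing guideline quoted above.
// HeapEstimate and recommendedHeapMb are illustrative names, not a real API.
public class HeapEstimate {
    static long recommendedHeapMb(double backupTb, double millionFiles) {
        double bySize  = backupTb * 600;      // ~600 MB per TB of storage
        double byCount = millionFiles * 600;  // ~600 MB per million files
        return (long) Math.ceil(Math.max(bySize, byCount));
    }

    public static void main(String[] args) {
        // e.g. a 2 TB selection containing 1.5 million files
        System.out.println(recommendedHeapMb(2.0, 1.5) + " MB"); // prints "1200 MB"
    }
}
```

Under this rule a 2 TB / 1.5-million-file selection needs about 1200 MB of heap, which matches the observation that -Xmx1000M starts but crashes later while -Xmx1500M would be comfortable.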
I use a memory-hog program (CrashPlan) written in Java, which sometimes, at peak use, needs some 1.5 GB of memory (this is controlled by the parameter -Xmx1500M). Usually (and at least when starting) it needs very little memory, but with anything lower than -Xmx1500M it crashes at peak use. After starting (with -Xmx1000M), Task Manager shows its memory usage at about 100 MB in Working Set, Private Set, and Assigned Memory. How can I start a Java program with a large max heap? I would also like slimmed-down memory usage so I can actually run apps on my DS410j again.
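One way to see what a given -Xmx flag actually buys you is to ask the JVM itself. A minimal sketch (my own, not from the original thread): on a 32-bit JVM the heap must fit in a contiguous region of the roughly 2–3 GB user address space, which is why -Xmx1500M can fail at startup even when plenty of RAM is free.

```java
// Print the maximum heap this JVM will use.
// Run with the flag in question, e.g.:  java -Xmx1000M MaxHeapCheck
public class MaxHeapCheck {
    public static void main(String[] args) {
        long maxMb = Runtime.getRuntime().maxMemory() / (1024 * 1024);
        System.out.println("Max heap available to this JVM: " + maxMb + " MB");
    }
}
```

If the JVM refuses to start at a given -Xmx, lowering the value in steps with this check shows where the contiguous-allocation limit sits on that particular 32-bit Windows machine.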