Is it necessary to load all of your data into memory? Maybe the analysis you want to do only requires 3 fields of a record instead of all 50, for example. Consider building a temporary dataset keyed by a hash (e.g. the record ID) to reduce the memory you need. Maybe your data is also unnecessarily wide: you're using bigints when you only need 3 significant figures, date and time when you only need the date, varchar(100) when you only need the first 5 letters of the last name. Try truncating the data to allow a less memory-intensive initial pass; then you can go back via your hash and look at finer details, like the time, once the dates have been sorted. So you would load records in a block, dump the portion of the data that is unneeded, and move on.
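To make that concrete, here's a minimal sketch in Python. Since you haven't shown your actual data, the column names (`id`, `last_name`, `amount`, `event_ts`) and the CSV format are my assumptions; the point is just keeping 3 fields out of many, truncating them, and storing the result in a dict keyed by record ID so you can come back later for the details.

```python
import csv
import io

# Hypothetical sample data standing in for your real records;
# imagine many more columns and rows than shown here.
raw = io.StringIO(
    "id,last_name,amount,event_ts,notes\n"
    "1,Anderson,123.456789,2024-01-15 09:30:00,long free text we never need\n"
    "2,Baker,98.7654321,2024-01-16 14:05:00,more text\n"
)

slim = {}  # hash keyed by record ID, holding only the truncated fields
for row in csv.DictReader(raw):
    slim[row["id"]] = (
        row["last_name"][:5],            # varchar(100) -> first 5 letters
        row["event_ts"].split(" ")[0],   # date+time -> date only
        float(f"{float(row['amount']):.3g}"),  # full precision -> 3 sig figs
    )

print(slim["1"])  # ('Ander', '2024-01-15', 123.0)
```

In a real pipeline you'd wrap the loop to read the file in blocks (e.g. with `itertools.islice`) rather than building one `StringIO`, but the principle is the same: discard the unneeded portion of each record as soon as it's read, and keep the ID so a second, targeted pass can recover the finer detail.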
It would be helpful if you gave us more details of what your data looks like, what you are trying to do with it, etc., or at least a facsimile of it if security/privacy keeps you from sharing the real thing. Sorry this is so general; I'm working with what I've got.