I have an app that needs to handle very large strings between a SQL Server database and .NET code. I have a LINQ query that generates the strings when saving them to the database, but when trying to create the strings from the database, the app crashes with an OutOfMemoryException because of the size of the strings.
Do I have to do something to make the LINQ-generated code avoid that? Using some kind of compression might be an option, but I would like to avoid that for performance reasons.
What do you call ‘very large’? And what is the string? CLOB? BLOB? xml?
I suspect you should be using something like ExecuteReader(), which (via IDataReader) exposes methods for reading such columns in chunks. Obviously with xml you might want an XmlReader, via cmd.ExecuteXmlReader().

Updated re LINQ comment (now deleted):
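As a rough sketch of the chunked-read approach (the table, column, and parameter names here are placeholders, not anything from your schema): open the reader with CommandBehavior.SequentialAccess so the provider streams the row, then pull the column through a fixed-size buffer with GetChars rather than materializing one giant string.

```csharp
using System;
using System.Data;
using System.Data.SqlClient;
using System.IO;

class ChunkedRead
{
    // Streams a large text column to 'destination' without ever holding
    // the whole value in memory. "BigTable"/"BigText"/"Id" are placeholders.
    static void StreamColumn(string connectionString, TextWriter destination)
    {
        using (var conn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand(
            "SELECT BigText FROM BigTable WHERE Id = @id", conn))
        {
            cmd.Parameters.AddWithValue("@id", 1);
            conn.Open();

            // SequentialAccess tells the provider to stream the row
            // instead of buffering it in full.
            using (IDataReader reader =
                cmd.ExecuteReader(CommandBehavior.SequentialAccess))
            {
                if (reader.Read())
                {
                    var buffer = new char[8192];
                    long offset = 0, read;
                    // GetChars copies successive chunks of column 0
                    // into our fixed-size buffer.
                    while ((read = reader.GetChars(0, offset, buffer, 0,
                                                   buffer.Length)) > 0)
                    {
                        destination.Write(buffer, 0, (int)read);
                        offset += read;
                    }
                }
            }
        }
    }
}
```

The same pattern works for binary columns via GetBytes.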
To use IDataReader directly from LINQ-to-SQL, I expect the closest you can get is ctx.GetCommand(), passing it a query; you would then use ExecuteReader or ExecuteXmlReader as above. I don't know much about EF…

If you give an example of the type of query that is failing, there might be some tricks possible – for example, if you are filtering or selecting subsets of the xml, there are things you can do in SQL/XML – perhaps in a UDF called via LINQ-to-SQL.
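To illustrate the GetCommand() route, here is a minimal sketch; the entity, context, and column names are invented placeholders standing in for your own generated LINQ-to-SQL classes. The idea is to let LINQ-to-SQL build the command, but execute it yourself so you can stream the column:

```csharp
using System.Data;
using System.Data.Common;
using System.Data.Linq;
using System.Data.Linq.Mapping;
using System.Linq;

// Placeholder entity/context; substitute your own generated classes.
[Table(Name = "BigTable")]
class BigEntity
{
    [Column(IsPrimaryKey = true)] public int Id;
    [Column] public string BigText;
}

class MyDataContext : DataContext
{
    public MyDataContext(string conn) : base(conn) { }
    public Table<BigEntity> BigEntities { get { return GetTable<BigEntity>(); } }
}

class GetCommandSketch
{
    static void Example(MyDataContext ctx)
    {
        IQueryable<string> query =
            ctx.BigEntities.Where(e => e.Id == 1).Select(e => e.BigText);

        // GetCommand hands back the DbCommand that LINQ-to-SQL would run,
        // so we can execute it ourselves and stream the result.
        using (DbCommand cmd = ctx.GetCommand(query))
        {
            if (cmd.Connection.State != ConnectionState.Open)
                cmd.Connection.Open();

            using (var reader =
                cmd.ExecuteReader(CommandBehavior.SequentialAccess))
            {
                while (reader.Read())
                {
                    var buffer = new char[8192];
                    long offset = 0, read;
                    while ((read = reader.GetChars(0, offset, buffer, 0,
                                                   buffer.Length)) > 0)
                    {
                        // process buffer[0..read) incrementally
                        offset += read;
                    }
                }
            }
        }
    }
}
```

Note that you take over connection/reader lifetime management when you go this route; LINQ-to-SQL no longer does it for you.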