XML listener and loading/parsing XML

Topics: Logging Application Block
Jun 25, 2010 at 3:14 PM

Hi All

Using the XML trace listener creates the XML file without a root node (which makes sense, since each log entry is simply appended to the file).

But loading and parsing the XML then entails something like the code below. The problem is that the file grows over time and can reach 100 MB after, say, 50k writes, and loading a file of that size into memory to parse it is a problem. Does anyone have thoughts on how to parse the XML while minimising memory overhead?

      if (System.IO.File.Exists(aFileName))
      {
        FileInfo f = new FileInfo(aFileName);

        // Pre-size the buffer to the file length; this often fails with an OutOfMemoryException
        var xmlString = new System.Text.StringBuilder((int)f.Length);
        using (FileStream fs = new FileStream(aFileName, FileMode.Open, FileAccess.Read, FileShare.ReadWrite))
        {
          using (StreamReader reader = new StreamReader(fs))
          {
            xmlString.Append(reader.ReadToEnd());
          }
        }

        // Wrap the root-less fragments in a dummy root so XDocument can parse them
        xml = XDocument.Parse("<Root>" + xmlString.ToString() + "</Root>");
      }
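
The only idea I have so far is to read the file with an XmlReader set to fragment conformance, so each log entry is streamed one element at a time instead of buffering the whole file. A rough, untested sketch of what I mean (the per-entry processing is left out):

      // Fragment conformance lets the reader accept multiple top-level elements (no root node)
      var settings = new System.Xml.XmlReaderSettings
      {
        ConformanceLevel = System.Xml.ConformanceLevel.Fragment
      };

      using (var fs = new FileStream(aFileName, FileMode.Open, FileAccess.Read, FileShare.ReadWrite))
      using (var reader = System.Xml.XmlReader.Create(fs, settings))
      {
        while (!reader.EOF)
        {
          if (reader.NodeType == System.Xml.XmlNodeType.Element)
          {
            // Materialise only the current log entry element, then move on
            var entry = (System.Xml.Linq.XElement)System.Xml.Linq.XNode.ReadFrom(reader);
            // ... process the entry here ...
          }
          else
          {
            reader.Read();
          }
        }
      }

Would be interested to hear whether anyone has a better approach.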

Jun 28, 2010 at 1:31 AM

Your question is outside the scope of Entlib, so I suggest posting it in other .NET forums.

However, have you considered working with multiple files rather than a single large file? If so, why not use the Rolling Flat File Trace Listener so that you get several smaller files instead of one big file? Use a text formatter and modify its Template property so that it mimics the format generated by the XML trace listener.
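
For reference, the listener and formatter entries would look roughly like the sketch below (file name, roll size, and template are only illustrative, and the category source mapping is omitted; use the EntLib configuration tool to generate the exact entries):

  <loggingConfiguration name="Logging Application Block" tracingEnabled="true" defaultCategory="General">
    <listeners>
      <!-- Rolls to a new file once the current one reaches ~10 MB -->
      <add name="Rolling Flat File Trace Listener"
           type="Microsoft.Practices.EnterpriseLibrary.Logging.TraceListeners.RollingFlatFileTraceListener, Microsoft.Practices.EnterpriseLibrary.Logging"
           listenerDataType="Microsoft.Practices.EnterpriseLibrary.Logging.Configuration.RollingFlatFileTraceListenerData, Microsoft.Practices.EnterpriseLibrary.Logging"
           fileName="trace.log"
           rollSizeKB="10240"
           rollFileExistsBehavior="Increment"
           formatter="Xml-like Text Formatter" />
    </listeners>
    <formatters>
      <!-- Template shaped to mimic the xml trace listener output -->
      <add name="Xml-like Text Formatter"
           type="Microsoft.Practices.EnterpriseLibrary.Logging.Formatters.TextFormatter, Microsoft.Practices.EnterpriseLibrary.Logging"
           template="&lt;LogEntry&gt;&lt;Timestamp&gt;{timestamp}&lt;/Timestamp&gt;&lt;Severity&gt;{severity}&lt;/Severity&gt;&lt;Message&gt;{message}&lt;/Message&gt;&lt;/LogEntry&gt;" />
    </formatters>
  </loggingConfiguration>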

 

Sarah Urmeneta
Global Technology and Solutions
Avanade, Inc.
entlib.support@avanade.com