I'm trying to diagnose a memory leak on an Azure Web App.
I used the Diagnose and Solve Problems > Diagnostic Tools > Collect Memory Dump tool (referenced here).
This collects a .dmp file and generates an analysis report. I can see the threads and other information in the Crash/Hang Analysis, but the DotNetMemoryAnalysis always fails with this error:
Type: System.OutOfMemoryException
Message: Exception of type 'System.OutOfMemoryException' was thrown.
Stack Trace:
DebugDiag.DotNet.NetDbgObj.d__73.MoveNext()
System.Linq.Enumerable.WhereSelectEnumerableIterator`2.MoveNext()
System.Linq.Enumerable.WhereEnumerableIterator`1.MoveNext()
System.Linq.Lookup`2.Create[TSource](IEnumerable`1 source, Func`2 keySelector, Func`2 elementSelector, IEqualityComparer`1 comparer)
System.Linq.GroupedEnumerable`3.GetEnumerator()
System.Linq.Enumerable.WhereSelectEnumerableIterator`2.MoveNext()
System.Linq.Buffer`1..ctor(IEnumerable`1 source)
System.Linq.OrderedEnumerable`1.d__1.MoveNext()
System.Linq.Enumerable.d__25`1.MoveNext()
System.Linq.Enumerable.FirstOrDefault[TSource](IEnumerable`1 source, Func`2 predicate)
DebugDiag.AnalysisRules.DotNetMemoryAnalysis.GCRootWalker.ShowRoots(NetScriptManager manager, NetDbgObj debugger, NetProgress progress, IEnumerable`1 top40Query) in C:\src\DebugDiag\Development\src\DebugDiag.AnalysisRules\DotNetMemoryAnalysis.cs:line 1875
DebugDiag.AnalysisRules.DotNetMemoryAnalysis.DoDotNetMemoryAnalysis() in C:\src\DebugDiag\Development\src\DebugDiag.AnalysisRules\DotNetMemoryAnalysis.cs:line 222
DebugDiag.AnalysisRules.DotNetMemoryAnalysis.RunAnalysisRule(NetScriptManager manager, NetProgress progress) in C:\src\DebugDiag\Development\src\DebugDiag.AnalysisRules\DotNetMemoryAnalysis.cs:line 182
DebugDiag.DotNet.NetAnalyzer.RunAnalysisRulesInternal(DumpFileType bitness, NetProgress progress, String symbolPath, String imagePath, String reportFileFullPath, Boolean twoTabs, AnalysisModes analysisMode)
I tried analyzing the file with the dotnet-dump CLI tool, but any analysis action errors with:
SOS does not support the current target architecture 0x0000014c
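For reference, the session was started roughly like this (the dump file name is a placeholder); the error above appears for any SOS command, e.g. dumpheap -stat or clrstack:
dotnet-dump analyze ./memory.dmp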
Opening the dmp in Visual Studio also does not appear to offer any analysis options, just debugging.
Is there a way I can run the analysis on the .dmp from another machine? Is there a different way I should collect the dump?
The analysis tool used by Azure Web Apps on Windows can be downloaded at https://www.microsoft.com/en-us/download/confirmation.aspx?id=58210
This did not solve my problem, though; I still get an out-of-memory exception.
I monitored the system memory usage, and it never got close to topping out.
I increased GCRootTimeout in Program Files\DebugDiag\AnalysisRules\DebugDiag.AnalysisRules.dll.config.
I also set gcAllowVeryLargeObjects in every config file I could find.
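For reference, the element I added looks like this (it only takes effect in 64-bit processes on .NET Framework 4.5 and later):
<configuration>
  <runtime>
    <!-- allow objects larger than 2 GB in total size -->
    <gcAllowVeryLargeObjects enabled="true" />
  </runtime>
</configuration>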
Thanks for sharing your dump file, @farlee2121. I opened it in WinDbg and kicked off !analyze -v, which pulls the available symbols into your local cache:
SYMSRV: BYINDEX: 0x1D
https://msdl.microsoft.com/download/symbols
SOS_x86_x86_4.8.4180.00.dll
5E7D1ED77b0000
SYMSRV: PATH: C:\debug\sym\SOS_x86_x86_4.8.4180.00.dll\5E7D1ED77b0000\SOS_x86_x86_4.8.4180.00.dll
SYMSRV: RESULT: 0x00000000
DBGHELP: C:\debug\sym\SOS_x86_x86_4.8.4180.00.dll\5E7D1ED77b0000\SOS_x86_x86_4.8.4180.00.dll - OK
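(If you're following along, a symbol path matching the C:\debug\sym cache above can be set before reloading:)
0:000> .sympath srv*C:\debug\sym*https://msdl.microsoft.com/download/symbols
0:000> .reload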
Trying to look at the heap with !dumpheap -stat then resulted in...
Object 5cdcf038 has an invalid method table.
0:000> !ListNearObj /d 5cdcf038
Before: 5cdcf014 36 (0x24) System.Collections.Hashtable+HashtableEnumerator
After: couldn't find any object between 0x5cdcf038 and 0x5cdd00cc
Heap local consistency not confirmed.
...which could indicate the GC was running at the time the dump was collected. There are two options at this point: load the mex extension and run !mex.dumpheap2 (shown below), or use PerfView to analyze the heap.
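Loading mex and running it looks like this (the path is an assumption; point .load at wherever you extracted mex.dll):
0:000> .load C:\debug\mex\mex.dll
0:000> !mex.dumpheap2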
Mex showed a fair amount of AutoMapper objects:
Count TotalSize Class Name
1,014,591 20,291,820 System.Linq.Expressions.FullConditionalExpression
2,873 24,391,820 System.Char[]
1,541,328 24,661,248 System.Linq.Expressions.AssignBinaryExpression
1,680,035 26,880,560 AutoMapper.Mappers.ConvertMapper+<>c__DisplayClass1_1
1,571,447 31,428,940 System.Linq.Expressions.LogicalBinaryExpression
1,680,036 33,600,720 System.Lazy<System.Linq.Expressions.LambdaExpression>
1,194,017 33,941,972 System.Linq.Expressions.Expression[]
3,161,253 37,935,036 System.Linq.Expressions.ConstantExpression
9,941 39,286,832 System.Collections.Generic.Dictionary+Entry<AutoMapper.TypePair,System.Lazy<System.Linq.Expressions.LambdaExpression>>[]
844,692 39,384,092 System.Reflection.MemberInfo[]
665,549 47,919,528 AutoMapper.PropertyMap
1,680,036 53,761,152 System.Func<System.Linq.Expressions.LambdaExpression>
253,050 58,765,632 System.Int32[]
339,941 61,674,100 System.String
25,206 70,703,012 System.Byte[]
Total 37,374,335 Object(s), Total Size: 1.03 GB, Free Objects 812(352.21 KB)
If you instead open the dump in PerfView and use Dump GC Heap, though, we can get a better picture:
Name Inc % Inc
LIB <<System.Core!Linq.Expressions.Expression>>> 21.8 214,783,296
+ AutoMapper!AutoMapper.TypeMap 21.8 214,783,296
+ LIB <<mscorlib!Dictionary>> 21.8 214,783,296
|+ AutoMapper!AutoMapper.MapperConfiguration 21.8 214,783,296
||+ AutoMapper!AutoMapper.Mapper 21.8 214,783,296
|||+ Fourstarzz.Accessors!Fourstarzz.Accessors.EntityFramework.DtoMapper 21.8 214,783,296
||||+ Fourstarzz.Accessors!Fourstarzz.Accessors.IntegrationAccessInfoAccessor 21.8 214,783,296
|||||+ Fourstarzz.Managers!Fourstarzz.Managers.Identity.AccountConnectionManager 10.9 108,058,320
||||||+ Fourstarzz.Managers.Adapters!Fourstarzz.Managers.Adapters.Identity.OkanjoRegistrationHandler 10.9 108,058,320
|||||| + Fourstarzz.Managers.Adapters!Fourstarzz.Managers.Adapters.Identity.CompositeIdentityEventHandler 10.9 108,058,320
|||||| + Fourstarzz.Managers!Fourstarzz.Managers.Identity.UserIdentityManager 10.9 108,058,320
|||||| + Fourstarzz.Clients.Website!Fourstarzz.Clients.Website.IdentityWrapper 10.9 108,058,320
|||||| |+ LIB <<System!Stack<Object>>> 10.9 108,058,320
|||||| | + Autofac!Autofac.Core.Disposer 10.9 108,058,320
|||||| | + Autofac!Autofac.Core.Lifetime.LifetimeScope 10.9 108,058,320
|||||| | |+ LIB <<mscorlib!Func>> 10.9 108,058,320
|||||| | | + Microsoft.AspNet.Identity.Owin!Owin.AppBuilderExtensions+<>c__DisplayClass1 10.9 108,058,320
|||||| | | |+ LIB <<mscorlib!Func,Microsoft.Owin.IOwinContext,Fourstarzz.Shared.FourstarzzIdentity.FourstarzzUserManager>>> 10.9 108,058,320
|||||| | | ||+ Microsoft.AspNet.Identity.Owin!Microsoft.AspNet.Identity.Owin.IdentityFactoryProvider 10.9 108,058,320
|||||| | | || + Microsoft.AspNet.Identity.Owin!Microsoft.AspNet.Identity.Owin.IdentityFactoryOptions 10.9 108,058,320
|||||| | | || + Microsoft.AspNet.Identity.Owin!Microsoft.AspNet.Identity.Owin.IdentityFactoryMiddleware> 10.9 108,058,320
|||||| | | || + Microsoft.AspNet.Identity.Owin!Microsoft.AspNet.Identity.Owin.IdentityFactoryMiddleware> 10.9 108,058,320
|||||| | | || + Microsoft.Owin!Microsoft.Owin.Infrastructure.OwinMiddlewareTransition 10.9 108,058,320
|||||| | | || + LIB <<Microsoft.Owin.Host.SystemWeb!Microsoft.Owin.Host.SystemWeb.IntegratedPipeline.IntegratedPipelineBlueprint>> 10.9 108,058,320
|||||| | | || + [static var Microsoft.Owin.Host.SystemWeb.OwinHttpModule._blueprint] 10.9 108,058,320
|||||| | | || + [static vars] 10.9 108,058,320
Here are some things to help you proceed. First, change your web app from x86 to x64 on the Application Settings blade; that will at least give you some more breathing room. Upon restarting your app, collect a memory dump from the Diagnose and solve problems blade to get a baseline. Then configure AutoHeal to collect a dump file when memory reaches an upper threshold (a sketch follows below), so you get around the OOM you're running into. Finally, PerfView will let you compare two heap dumps so you can see which objects are growing in allocation; check the Starting an Analysis help in PerfView for more info.
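As a sketch of the AutoHeal piece (the threshold and dump folder are assumptions you'd adapt, and verify the procdump path on your instance), a memory trigger with a custom action can be added to web.config roughly like this:
<system.webServer>
  <monitoring>
    <triggers>
      <!-- fire when private bytes exceed ~800 MB; tune to your App Service plan -->
      <memory privateBytesInKB="800000" />
    </triggers>
    <actions value="CustomAction">
      <!-- %1% is the worker process id; -ma writes a full memory dump -->
      <customAction exe="d:\devtools\sysinternals\procdump.exe"
                    parameters="-accepteula -ma %1% d:\home\data\memdumps" />
    </actions>
  </monitoring>
</system.webServer>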
In my personal experience, AutoMapper can cause performance issues if not properly configured. I was once creating a Mapper each time I processed data when it wasn't necessary. It also looks like you're adding a Mapper to your Autofac IoC container, and that mapper holds a reference to an event handler. Event handlers can pin objects in the heap, preventing the GC from collecting them. A registration sketch follows.
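As a sketch of what I mean (DtoMappingProfile is a hypothetical stand-in for your own profiles), build the MapperConfiguration once and register a single shared IMapper with Autofac, instead of newing up a Mapper per call:
using Autofac;
using AutoMapper;

public static class MappingRegistration
{
    public static void Register(ContainerBuilder builder)
    {
        // Build the (expensive) configuration exactly once per container;
        // this is what holds all those expression trees.
        builder.Register(_ => new MapperConfiguration(cfg =>
                cfg.AddProfile<DtoMappingProfile>()))
            .AsSelf()
            .SingleInstance();

        // Share one IMapper for the life of the container.
        builder.Register(ctx => ctx.Resolve<MapperConfiguration>().CreateMapper())
            .As<IMapper>()
            .SingleInstance();
    }
}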