Redis occasional hangs


We run a single standalone Redis for Windows (2.8.2104) node on one server. Two other servers communicate with this instance.

We use it with SignalR and for caching; the dump is about 700 MB in size.

From time to time the instance hangs for 1-3 minutes, after which it recovers by itself. The problem only seems to occur when there is some traffic on our page.

During these hangs we get the exception below:

StackExchange.Redis.RedisConnectionException: SocketFailure on EVAL
   at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
   at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
   at Microsoft.Owin.Cors.CorsMiddleware.d__0.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
   at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
   at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
   at Microsoft.Owin.Mapping.MapMiddleware.d__0.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
   at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
   at

When I search through the Redis log, I can occasionally find these errors:

[33144] 18 Feb 18:18:44.843 # === REDIS BUG REPORT START: Cut & paste starting from here ===
[33144] 18 Feb 18:18:44.844 # Out Of Memory allocating 308457 bytes.
[33144] 18 Feb 18:18:44.844 # --- ABORT
[33144] 18 Feb 18:18:44.844 # --- STACK TRACE
redis-server.exe!LogStackTrace(c:\release\redis\src\win32_interop\win32_stacktrace.cpp:95)(0x00000016, 0x042E0028, 0x00000000, 0x00000001)
redis-server.exe!AbortHandler(c:\release\redis\src\win32_interop\win32_stacktrace.cpp:206)(0x00000001, 0x89EE7767, 0x40150880, 0xBB7A5ED7)
redis-server.exe!raise(f:\dd\vctools\crt\crtw32\misc\winsig.c:587)(0x00000001, 0x00000000, 0x0004B4E9, 0x042E0028)
redis-server.exe!abort(f:\dd\vctools\crt\crtw32\misc\abort.c:82)(0x00000001, 0x4013F888, 0x0004B4E9, 0x00008000)
redis-server.exe!redisOutOfMemoryHandler(c:\release\redis\src\redis.c:3397)(0x0004B4E9, 0x4007DA07, 0x042E0028, 0x4007A27B)
redis-server.exe!zmalloc(c:\release\redis\src\zmalloc.c:147)(0xBDF01150, 0x4007EB2C, 0xBDF01150, 0x446D6B10)
redis-server.exe!sdsnewlen(c:\release\redis\src\sds.c:59)(0xBDF01150, 0xBDF01150, 0x3E74FD95, 0x00000003)
redis-server.exe!_addReplyStringToList(c:\release\redis\src\networking.c:271)(0xBDF01150, 0xBDF01150, 0x042E0028, 0x400E34FE)
redis-server.exe!addReplyBulkCBuffer(c:\release\redis\src\networking.c:517)(0xFFFFFFFF, 0x042E0028, 0x01B77260, 0x01B77260)
redis-server.exe!luaReplyToRedisReply(c:\release\redis\src\scripting.c:792)(0x00000004, 0xBDF01150, 0x00000002, 0x00000002)
redis-server.exe!luaReplyToRedisReply(c:\release\redis\src\scripting.c:839)(0xFFFFFFFF, 0x00A7F690, 0x67897B20, 0xBDF01150)
redis-server.exe!evalGenericCommand(c:\release\redis\src\scripting.c:1048)(0x71E66870, 0x00000000, 0x00000001, 0x000000B2)
redis-server.exe!call(c:\release\redis\src\redis.c:2016)(0x56C60B04, 0x4008B000, 0x00000000, 0x000000B2)
redis-server.exe!processCommand(c:\release\redis\src\redis.c:2235)(0xBDF01150, 0x000000B2, 0x000023B5, 0x00000001)
redis-server.exe!processInputBuffer(c:\release\redis\src\networking.c:1274)(0xBDF01150, 0x00000000, 0x00000000, 0x00000001)
redis-server.exe!readQueryFromClient(c:\release\redis\src\networking.c:1329)(0xFFE51650, 0x00000001, 0x44726F20, 0x0000012C)
redis-server.exe!aeMain(c:\release\redis\src\ae.c:487)(0x56C5C7F8, 0x00000002, 0x00000000, 0x00000002)
redis-server.exe!redis_main(c:\release\redis\src\redis.c:3524)(0x0024BA50, 0x00000002, 0x56C5C7EB, 0x00000002)
redis-server.exe!main(c:\release\redis\src\win32_interop\win32_qfork.cpp:1363)(0x00000016, 0xFFFFFFFF, 0x00000016, 0x0023F3A0)
redis-server.exe!ServiceWorkerThread(c:\release\redis\src\win32_interop\win32_service.cpp:485)(0x4000B3D0, 0x00000000, 0x00000000, 0x00000000)
KERNEL32.DLL!BaseThreadInitThunk(c:\release\redis\src\win32_interop\win32_service.cpp:485)(0xBB0113B0, 0x00000000, 0x00000000, 0x00000000)
ntdll.dll!RtlUserThreadStart(c:\release\redis\src\win32_interop\win32_service.cpp:485)(0x00000000, 0x00000000, 0x00000000, 0x00000000)
ntdll.dll!RtlUserThreadStart(c:\release\redis\src\win32_interop\win32_service.cpp:485)(0x00000000, 0x00000000, 0x00000000, 0x00000000)
[33144] 18 Feb 18:18:44.857 # === REDIS BUG REPORT END. Make sure to include from START to END. ===

maxheap is set to 3000mb; the server has a total of 64 GB RAM, and about 10 GB were free.
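For context, in the Windows port of Redis, maxheap caps the process's private heap, so an allocation failure like the one in the log likely means that 3000 MB cap was exhausted, not physical RAM (which still had ~10 GB free). An illustrative redis.windows.conf fragment; the maxmemory value and policy below are assumptions, not taken from the question:

```
# redis.windows.conf (illustrative sketch, not the asker's actual config)
maxheap 3000mb              # Windows-port heap cap; allocations beyond this abort with OOM
maxmemory 2500mb            # assumption: keep the dataset limit below maxheap so reply
                            # buffers and script output still have headroom
maxmemory-policy allkeys-lru  # assumption: evict instead of failing when the limit is hit
```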

There is also one more thing, but I'm not sure whether it is really related to the problem.

Most of the time the problem increases in frequency; then, when I restart IIS on one of the servers, the problem is gone completely for hours or days. I wondered whether there might be hanging / stacking SignalR queues, but I don't have any further evidence that this is the case.

Any hints about that?

c#
redis
stackexchange.redis
asked on Stack Overflow Feb 18, 2016 by Boas Enkler • edited Feb 18, 2016 by Boas Enkler

1 Answer


I found the solution: it was not related to SignalR itself, but to the Redis client that the SignalR scaleout uses.

It turned out that the default Microsoft.AspNet.SignalR.Redis package includes a private reference to an old StackExchange.Redis client.

That client had an issue releasing connection handles. When IIS or the Redis server is restarted, all these handles are freed and everything runs again.

One solution would have been to build our own SignalR scaleout stream implementation; the other (which was easier for us) was to simply disable the SignalR scaleout stream.
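For reference, the Redis scaleout is typically wired up in the OWIN startup class via the `UseRedis` extension from Microsoft.AspNet.SignalR.Redis, so "disabling the scaleout stream" amounts to removing that one call. A minimal sketch; the server name and event key are placeholders, not values from the question:

```csharp
using Microsoft.AspNet.SignalR;
using Owin;

public class Startup
{
    public void Configuration(IAppBuilder app)
    {
        // Calling UseRedis here enables the scaleout backplane and pulls in
        // the package's bundled (older) StackExchange.Redis client.
        // Commenting it out disables the scaleout stream entirely, which is
        // the easier fix described above:
        //
        // GlobalHost.DependencyResolver.UseRedis(
        //     "redis-host", 6379, password: null, eventKey: "SignalR");

        app.MapSignalR();
    }
}
```

Without the backplane, hubs still work on each server individually; messages are just no longer fanned out across the two web servers.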

All other components that access Redis use ServiceStack.Redis, which works fine.

Now, about one month later, we haven't had any more issues with the Redis server.

answered on Stack Overflow Mar 31, 2016 by Boas Enkler

User contributions licensed under CC BY-SA 3.0