5 minute read

After making some changes to their domains, one of my customers noticed that their ASP.NET 2.0 applications no longer responded. HTML pages and images were serving just fine, as were their ASP.NET 1.1 sites, but all of their 2.0 sites on all servers seemed to hang when browsing to them in IE.

Looking at the IIS logs, they saw two 401 responses for their pages but never the 200 OK that normally follows when a page is served successfully.
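A quick way to spot this pattern in an IIS log is to look for URLs that only ever logged 401s and never a 200. Here is a minimal sketch assuming a simplified W3C-style layout (method, uri-stem, status); in a real log the #Fields: directive tells you which column sc-status is in:

```python
# Sketch: find URLs in a W3C-format IIS log that got 401s but never a 200.
# Assumes a simplified field layout (cs-method cs-uri-stem sc-status);
# check the #Fields: header in your own logs for the real column order.
from collections import defaultdict

def urls_without_200(log_lines):
    statuses = defaultdict(set)
    for line in log_lines:
        if line.startswith("#") or not line.strip():
            continue  # skip W3C directives and blank lines
        _, uri, status = line.split()[:3]
        statuses[uri].add(status)
    return sorted(u for u, s in statuses.items()
                  if "401" in s and "200" not in s)

sample = [
    "#Fields: cs-method cs-uri-stem sc-status",
    "GET /default.aspx 401",
    "GET /default.aspx 401",
    "GET /index.htm 401",
    "GET /index.htm 200",
]
print(urls_without_200(sample))  # -> ['/default.aspx']
```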

When they ran aspnet_regiis /i to re-register ASP.NET 2.0, they got a message, both in the event viewer and in the aspnet installation logs, saying

Failed while restarting W3SVC. Error code: 0x8007041d

0x8007041d stands for ERROR_SERVICE_REQUEST_TIMEOUT, so not much help there…
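As an aside, HRESULTs that start with 0x8007 wrap a plain Win32 error code in their low 16 bits (facility 7 is FACILITY_WIN32), so you can decode them without a lookup table:

```python
# Decode an 0x8007xxxx HRESULT: facility 7 means the low word is a
# plain Win32 error code.
hr = 0x8007041D
facility = (hr >> 16) & 0x1FFF   # 7 -> FACILITY_WIN32
win32_code = hr & 0xFFFF         # 1053 -> ERROR_SERVICE_REQUEST_TIMEOUT
print(facility, win32_code)      # 7 1053
```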

We took a hang dump with adplus -hang -pn w3wp.exe while the process was hanging to figure out what was going on. If you are just interested in the solution, feel free to scroll to the bottom of the post, but if you are a debugging geek, here comes the nitty-gritty.

Debugging the dumps

The first thing that was peculiar was that !threads returned the following error

0:005> !threads
Failed to request ThreadStore

This usually means that the .NET runtime has not been loaded into the process, or that the thread pool hasn't been initialized yet, so the process hasn't even gotten to the point where it serves ASP.NET pages.

Running ~* kb 2000 to check out the stacks on all threads, we find a few threads sitting in this stack:

  14  Id: 13fc.1510 Suspend: 1 Teb: 7ffda000 Unfrozen
ChildEBP RetAddr  Args to Child
01a1fc48 7c822124 7c83970f 000002c0 00000000 ntdll!KiFastSystemCallRet
01a1fc4c 7c83970f 000002c0 00000000 00000000 ntdll!NtWaitForSingleObject+0xc
01a1fc88 7c839620 00000000 00000004 00000344 ntdll!RtlpWaitOnCriticalSection+0x19c
01a1fca8 5a323468 01461ef8 01461e90 5a322a48 ntdll!RtlEnterCriticalSection+0xa8
01a1fdd4 5a322dc9 00000344 00000000 00000000 w3isapi!ISAPI_DLL::Load+0x4b
01a1fdf8 5a3228ca 000c6b00 01a1fe20 00000344 w3isapi!ISAPI_DLL_MANAGER::GetIsapi+0x1cd
01a1fe24 5a3968ff 00dd3ba0 00dd3614 01a1fe50 w3isapi!ProcessIsapiRequest+0x96
01a1fe58 5a3967e0 00000000 00000000 00dd2958 w3core!W3_ISAPI_HANDLER::IsapiDoWork+0x3fd
01a1fe78 5a396764 01a1fee8 00dd2958 00000000 w3core!W3_ISAPI_HANDLER::DoWork+0xb0
01a1fe98 5a3966f4 00dd2958 00000000 01a1fec4 w3core!W3_HANDLER::MainDoWork+0x16e
01a1fea8 5a3966ae 00dd2960 00dd2958 00000001 w3core!W3_CONTEXT::ExecuteCurrentHandler+0x53
01a1fec4 5a396648 00000001 01a1fee8 0025ae50 w3core!W3_CONTEXT::ExecuteHandler+0x51
01a1feec 5a392264 00000000 00000000 00000000 w3core!W3_STATE_HANDLE_REQUEST::DoWork+0x9a
01a1ff10 5a3965ea 00000000 00000000 00000000 w3core!W3_MAIN_CONTEXT::DoWork+0xa6
01a1ff2c 5a36169f 00dd1e90 00dd1e90 5a361650 w3core!W3_MAIN_CONTEXT::OnNewRequest+0x55
01a1ff38 5a361650 00258348 00258354 01a1ff5c w3dt!UL_NATIVE_REQUEST::DoStateProcess+0x48
01a1ff48 5a3616ca 0000047f 00000000 00dd1e94 w3dt!UL_NATIVE_REQUEST::DoWork+0x7f
01a1ff5c 5a3024de 00000000 0000047f 00dd1e94 w3dt!OverlappedCompletionRoutine+0x1a
01a1ff8c 5a3026bc 00000000 0167f808 5a300000 w3tp!THREAD_POOL_DATA::ThreadPoolThread+0x73
01a1ffa0 5a301db9 00258348 00000000 00000000 w3tp!THREAD_POOL_DATA::ThreadPoolThread+0x24
01a1ffb8 77e6608b 0167f808 00000000 00000000 w3tp!THREAD_MANAGER::ThreadManagerThread+0x39
01a1ffec 00000000 5a301d80 0167f808 00000000 kernel32!BaseThreadStart+0x34

So it is trying to process an ISAPI request (an ASP.NET request in this case), and while loading the ISAPI_DLL it got stuck waiting on a critical section…

The first parameter to w3isapi!ISAPI_DLL_MANAGER::GetIsapi (0xc6b00) holds the name of the ISAPI dll being loaded:

0:005> du 000c6b00
000c6b00  "C:\WINDOWS\Microsoft.NET\Framewo"
000c6b40  "rk\v2.0.50727\aspnet_isapi.dll"

Now the question becomes: why is it stuck on this critical section? To answer that, we can run !critlist from SieExtPub.dll to find the owner

0:005> !critlist
CritSec at  1461ef8.  Owned by thread 5.
  Waiting Threads: 4 6 7 11 12 14

or simply run !locks on the critical section (0x1461ef8, the first parameter to RtlEnterCriticalSection)

0:000> !locks 1461ef8

CritSec +1461ef8 at 01461ef8
WaiterWoken        No
LockCount          6
RecursionCount     1
OwningThread       11cc
EntryCount         0
ContentionCount    6
*** Locked

Scanned 716 critical sections
0:000> ~~[11cc]
.  5  Id: 13fc.11cc Suspend: 1 Teb: 7ffd8000 Unfrozen
      Start: w3tp!THREAD_MANAGER::ThreadManagerThread (5a301d80)
      Priority: 0  Priority class: 32  Affinity: 3

In both cases we find that the owning thread is thread number 5 (thread ID 11cc), so we can switch to thread 5 and look at what it is doing.

0:000> ~5s
eax=00d4f5c0 ebx=0000071a ecx=00000003 edx=00000074 esi=000001fd edi=00000000
eip=7c82ed54 esp=00d4e900 ebp=00d4e970 iopl=0         nv up ei pl zr na pe nc
cs=001b  ss=0023  ds=0023  es=0023  fs=003b  gs=0000             efl=00000246
7c82ed54 c3              ret
0:005> kb 2000
ChildEBP RetAddr  Args to Child
00d4e8fc 7c822124 77e6bad8 000001fd 00000001 ntdll!KiFastSystemCallRet
00d4e900 77e6bad8 000001fd 00000001 00000000 ntdll!NtWaitForSingleObject+0xc
00d4e970 77c61c44 000001fd ffffffff 00000001 kernel32!WaitForSingleObjectEx+0xac
00d4e98c 77c61bfa 00090434 00000001 ffffffff rpcrt4!UTIL_WaitForSyncIO+0x20
00d4e9b0 77c59f39 000903fc 00090434 00d4e9e0 rpcrt4!UTIL_GetOverlappedResultEx+0x1d
00d4e9d8 77c5edd5 000903fc 00d4eb24 00d4eb1c rpcrt4!CO_SyncRecv+0x71
00d4ea00 77c631f8 00000000 000c1a38 00000080 rpcrt4!OSF_CCONNECTION::TransSendReceive+0xb8
00d4ea88 77c62f80 000c1a38 00090540 00000001 rpcrt4!OSF_CCONNECTION::SendFragment+0x2ae
00d4eae0 77c62ff3 00000000 00000068 00d4eb24 rpcrt4!OSF_CCALL::SendNextFragment+0x1e2
00d4eb28 77c630c8 00d4ebb0 00d4eb6c 00000001 rpcrt4!OSF_CCALL::FastSendReceive+0x144
00d4eb44 77c63161 00d4ebb0 00d4eb6c 00d4ebb0 rpcrt4!OSF_CCALL::SendReceiveHelper+0x5b
00d4eb74 77c713ca 000c1a50 00d4eb94 77c72c8d rpcrt4!OSF_CCALL::SendReceive+0x41
00d4eb80 77c72c8d 00d4ebb0 77f618b8 00d4ef9c rpcrt4!I_RpcSendReceive+0x24
00d4eb94 77ce219b 00d4ebdc 000c1ab8 00000000 rpcrt4!NdrSendReceive+0x2b
00d4ef7c 77f657e5 77f618b8 77f63f54 00d4ef9c rpcrt4!NdrClientCall2+0x22e
00d4ef94 77f656cd 000c24c8 00000001 00d4f0e8 advapi32!LsarLookupNames3+0x1c
00d4f064 77f65601 000c24c8 00000000 00000001 advapi32!LsaICLookupNames+0xce
00d4f09c 77f654e1 000c24c8 00000000 00000001 advapi32!LsaLookupNames2+0x39
00d4f10c 77f6544c 00000000 00d4f9b4 00000000 advapi32!LookupAccountNameInternal+0x93
00d4f134 6a2afb3c 00000000 00d4f9b4 00000000 advapi32!LookupAccountNameW+0x21
00d4f580 6a2af4f9 00d4f9b4 00d4fbf0 00d4fbf8 webengine!GetNonSpecialPrincipalSID+0x7a
00d4fbb8 6a2af65b 6a2af73c 00d4fbf0 01922710 webengine!GetPrincipalSID+0x746
00d4fc48 6a2b00b5 00d4fc68 00000000 00000000 webengine!CPerfCounterServer::CreateSd+0xb2
00d4fc74 6a2b01d1 00000001 01922710 6a2afec7 webengine!CPerfCounterServer::StartNewPipeInstance+0x2d
00d4fc80 6a2afec7 00000000 00000000 6a2adccc webengine!CPerfCounterServer::InitPerfPipe+0x11
00d4fca8 5a3233f8 00d4fccc 01461e90 00000000 webengine!CPerfCounterServer::Initialize+0x6c
00d4fca8 5a3233f8 00d4fccc 01461e90 00000000 w3isapi!ISAPI_DLL::Load+0x219
00d4fdd4 5a322dc9 00000280 00000000 00000000 w3isapi!ISAPI_DLL::Load+0x219
00d4fdf8 5a3228ca 000bd4a0 00d4fe20 00000280 w3isapi!ISAPI_DLL_MANAGER::GetIsapi+0x1cd
00d4fe24 5a3968ff 00dd7a80 00dd74f4 00d4fe50 w3isapi!ProcessIsapiRequest+0x96
00d4fe58 5a3967e0 00000000 00000000 00dd6838 w3core!W3_ISAPI_HANDLER::IsapiDoWork+0x3fd
00d4fe78 5a396764 00d4fee8 00dd6838 00000000 w3core!W3_ISAPI_HANDLER::DoWork+0xb0
00d4fe98 5a3966f4 00dd6838 00000000 00d4fec4 w3core!W3_HANDLER::MainDoWork+0x16e
00d4fea8 5a3966ae 00dd6840 00dd6838 00000001 w3core!W3_CONTEXT::ExecuteCurrentHandler+0x53
00d4fec4 5a396648 00000001 00d4fee8 0025ae50 w3core!W3_CONTEXT::ExecuteHandler+0x51
00d4feec 5a392264 00000000 00000000 00000000 w3core!W3_STATE_HANDLE_REQUEST::DoWork+0x9a
00d4ff10 5a3965ea 00000000 00000000 00000000 w3core!W3_MAIN_CONTEXT::DoWork+0xa6
00d4ff2c 5a36169f 00dd5d70 00dd5d70 5a361650 w3core!W3_MAIN_CONTEXT::OnNewRequest+0x55
00d4ff38 5a361650 00258348 00258354 00d4ff5c w3dt!UL_NATIVE_REQUEST::DoStateProcess+0x48
00d4ff48 5a3616ca 0000038f 00000000 00dd5d74 w3dt!UL_NATIVE_REQUEST::DoWork+0x7f
00d4ff5c 5a3024de 00000000 0000038f 00dd5d74 w3dt!OverlappedCompletionRoutine+0x1a
00d4ff8c 5a3026bc 00000000 00258440 5a300000 w3tp!THREAD_POOL_DATA::ThreadPoolThread+0x73
00d4ffa0 5a301db9 00258348 00000000 00000000 w3tp!THREAD_POOL_DATA::ThreadPoolThread+0x24
00d4ffb8 77e6608b 00258440 00000000 00000000 w3tp!THREAD_MANAGER::ThreadManagerThread+0x39
00d4ffec 00000000 5a301d80 00258440 00000000 kernel32!BaseThreadStart+0x34

So this thread is busy loading the ISAPI_DLL, and as part of that it needs to initialize some performance counters and look up some account names.

Searching MSDN for LookupAccountName, we find that its second parameter is a string containing the account name we are trying to look up, which in this case turns out to be processidentity:

0:005> du 00d4f9b4
00d4f9b4  "processidentity"

The problem is that no account or group named processidentity exists, so the lookup falls through to the domain and any trusted domains, and that can of course take some time. Sometimes the delay is around 30 seconds, and sometimes, if something funky is going on with the domains, it gets stuck completely, as in this case.
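The overall hang pattern is a classic one: one thread takes a lock and then makes a slow blocking call while still holding it, and every other worker that needs the lock piles up behind it. A toy sketch (the lock stands in for the ISAPI_DLL critical section, the stall for the stuck domain lookup):

```python
import threading, time

load_lock = threading.Lock()  # stands in for the ISAPI_DLL critical section
done = []

def load_isapi(worker_id):
    with load_lock:           # every request thread needs this lock
        done.append(worker_id)

# The "owner" thread grabs the lock and stalls in a slow account lookup.
load_lock.acquire()
workers = [threading.Thread(target=load_isapi, args=(i,)) for i in range(3)]
for t in workers:
    t.start()
time.sleep(0.2)               # the lookup is taking its time...
print("finished while lookup hung:", done)      # [] -- everyone is blocked
load_lock.release()           # the lookup finally returns
for t in workers:
    t.join()
print("finished after release:", sorted(done))  # [0, 1, 2]
```

Every request thread runs the same stack, which is exactly why all of the 2.0 sites hung at once: there is only one critical section guarding the DLL load.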


I searched a bit on LookupAccountName, ASP.NET and ProcessIdentity and found a blog post from one of my buddies, Jim Cheshire. Apparently an issue in ASP.NET causes it to go out and look for certain account names that may not exist. The issue is fixed in 2.0 SP1, so you can download that if you are running into this, and it is also fixed in a hotfix for 2.0 RTM which will soon be documented in KB944157. As of today the article is still in the process of being published.

Note: if you need a hotfix you can either call in to support and give them the KB number, or use the automated online system. Hotfix requests are free of charge, assuming the hotfix is the only thing the case is about.

Laters, Tess