Coleen
#1 Posted : Monday, December 10, 2007 11:11:38 AM(UTC)

Rank: Member

Joined: 4/30/2007(UTC)
Posts: 383

Error
Session:
To:    http://www.us.com/WebResource.axd?d=mT4rycEdteJmVeccR0AbYg2&t=633197198131449088
From:  http://72.14.205.104/search?q=cache:jXWHvVUYkV4J:www.us.com/Folios-Notebooks.aspx+promotional+portfolios,+journals+and+jotters&hl=en&ct=clnk&cd=42&gl=us
User:  207.237.210.168 (207.237.210.168)
Agent: Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.8.1.11) Gecko/20071127 Firefox/2.0.0.11
Coleen
#2 Posted : Monday, December 10, 2007 11:12:35 AM(UTC)

Rank: Member

Joined: 4/30/2007(UTC)
Posts: 383

System.Web: This is an invalid webresource request.
   at System.Web.Handlers.AssemblyResourceLoader.System.Web.IHttpHandler.ProcessRequest(HttpContext context)
   at System.Web.HttpApplication.CallHandlerExecutionStep.System.Web.HttpApplication.IExecutionStep.Execute()
   at System.Web.HttpApplication.ExecuteStep(IExecutionStep step, Boolean& completedSynchronously)
Coleen
#3 Posted : Monday, December 10, 2007 11:20:35 AM(UTC)

Rank: Member

Joined: 4/30/2007(UTC)
Posts: 383

My conclusion is that the "Padding is invalid and cannot be removed" error is probably caused by a user's session expiring: the browser then loads a cached version of the page, but still requests the script resource referenced in that cached page from the server.

The "Padding is invalid and cannot be removed" error has been appearing over and over in the logs for the last few days, with no changes on our end other than the SP3 and search updates, so I'm wondering if this is somehow related.


http://forums.microsoft.com/MSDN/ShowPost.aspx?PostID=271482&SiteID=1
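
If the goal is simply to keep these expired-page requests from flooding the error log, one option is to intercept them in Global.asax. This is only a minimal sketch under the assumption that the errors surface through Application_Error; it is not BVSoftware's code, and answering with a 404 is my own choice for how to handle the stale request:

// Global.asax.cs -- a sketch, not BV's implementation.
using System;
using System.Security.Cryptography;
using System.Web;

public partial class Global : HttpApplication
{
    protected void Application_Error(object sender, EventArgs e)
    {
        Exception ex = Server.GetLastError();

        // Stale WebResource.axd links from cached pages fail inside the
        // machine-key decryption with "Padding is invalid and cannot be removed".
        bool isWebResource = Request.Path.EndsWith(
            "WebResource.axd", StringComparison.OrdinalIgnoreCase);

        if (isWebResource && ex.GetBaseException() is CryptographicException)
        {
            Server.ClearError();        // keep it out of the event log
            Response.StatusCode = 404;  // the cached resource link is effectively dead
            Response.End();
        }
    }
}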
Coleen
#4 Posted : Monday, December 10, 2007 12:22:02 PM(UTC)

Rank: Member

Joined: 4/30/2007(UTC)
Posts: 383

FWIW, it's the Googlebot scanning the site that is getting the error.

mscorlib: Padding is invalid and cannot be removed.
   at System.Security.Cryptography.RijndaelManagedTransform.DecryptData(Byte[] inputBuffer, Int32 inputOffset, Int32 inputCount, Byte[]& outputBuffer, Int32 outputOffset, PaddingMode paddingMode, Boolean fLast)
   at System.Security.Cryptography.RijndaelManagedTransform.TransformFinalBlock(Byte[] inputBuffer, Int32 inputOffset, Int32 inputCount)
   at System.Security.Cryptography.CryptoStream.FlushFinalBlock()
   at System.Web.Configuration.MachineKeySection.EncryptOrDecryptData(Boolean fEncrypt, Byte[] buf, Byte[] modifier, Int32 start, Int32 length, Boolean useValidationSymAlgo)
   at System.Web.UI.Page.DecryptString(String s)
   at System.Web.Handlers.AssemblyResourceLoader.System.Web.IHttpHandler.ProcessRequest(HttpContext context)
   at System.Web.HttpApplication.CallHandlerExecutionStep.System.Web.HttpApplication.IExecutionStep.Execute()
   at System.Web.HttpApplication.ExecuteStep(IExecutionStep step, Boolean& completedSynchronously)
12/10/2007 11:00:50 AM  Error
Session:
To:    http://www.us.com/WebResource.axd?d=miuRP77sVimDS0namjC3IyapXoAa5D03qp3ZFjgP6Qs1&t=633320533915656272
From:
User:  66.249.73.137 (66.249.73.137)
Agent: Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)
Andy Miller
#5 Posted : Monday, December 10, 2007 1:14:02 PM(UTC)

Rank: Member

Joined: 11/5/2003(UTC)
Posts: 2,136

Was thanked: 1 time(s) in 1 post(s)
WebResource.axd is used to retrieve images and JavaScript that are embedded within a DLL. My guess is that Google is trying to retrieve something that no longer exists... kind of like a 404. Since the content retrieved through WebResource.axd is pretty useless to search engines, you can safely exclude all WebResource.axd URLs from indexing using your robots.txt file.
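
For context, here's a rough sketch of where those URLs come from; the resource name and control type below are hypothetical, but the attribute and API are the standard ASP.NET 2.0 mechanism:

// In the control library's AssemblyInfo.cs -- marks an embedded file as
// servable through WebResource.axd ("MyCompany.Web.menu.js" is made up):
[assembly: System.Web.UI.WebResource("MyCompany.Web.menu.js", "text/javascript")]

// In a page or control, ASP.NET builds the WebResource.axd?d=...&t=... URL.
// The d parameter is an identifier encrypted with the machine key, which is
// why a stale or mangled value blows up in the crypto layer instead of 404ing:
string url = Page.ClientScript.GetWebResourceUrl(
    typeof(MyControl), "MyCompany.Web.menu.js");

If the machine key changes (for example, auto-generated keys after a redeploy), previously issued URLs stop decrypting, which matches the padding error above.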
Andy Miller
Structured Solutions

Shipper 3 - High Velocity Shipment Processing
Coleen
#6 Posted : Tuesday, December 11, 2007 11:27:12 PM(UTC)

Rank: Member

Joined: 4/30/2007(UTC)
Posts: 383

User-agent: *
Disallow: /*.axd$


Seems to be doing the trick, thanks Andy.
jetheredge
#7 Posted : Wednesday, December 12, 2007 8:33:13 AM(UTC)

Rank: Member

Joined: 3/1/2006(UTC)
Posts: 1,142

Nice tip, thanks Andy.
Justin Etheredge
Senior Software Engineer
BVSoftware
tmissey
#8 Posted : Wednesday, December 19, 2007 3:16:41 PM(UTC)

Rank: Member

Joined: 10/29/2007(UTC)
Posts: 33

Hello, I am also seeing the webresource.axd errors in the event log.


I added the suggested:

Disallow: /*.axd$



to the robots.txt, but I am still seeing the errors as the bots go by. Is anyone else still seeing the errors after the robots.txt modification, or does anyone have additional suggestions?
Coleen
#9 Posted : Wednesday, December 19, 2007 7:59:13 PM(UTC)

Rank: Member

Joined: 4/30/2007(UTC)
Posts: 383

Not working for me either. I thought it was, but it isn't.
Andy Miller
#10 Posted : Thursday, December 20, 2007 2:47:19 AM(UTC)

Rank: Member

Joined: 11/5/2003(UTC)
Posts: 2,136

Was thanked: 1 time(s) in 1 post(s)
I used the robots.txt tester in Google Webmaster Tools to see if that robots.txt file worked. It did not block the sample URL shown in the error message at the top of this thread, but this one did:

User-agent: *
Disallow: /WebResource.axd

Google's tester is pretty cool. You can test different agents, URLs, etc.
Andy Miller
Structured Solutions

Shipper 3 - High Velocity Shipment Processing
tmissey
#11 Posted : Thursday, December 20, 2007 9:23:10 AM(UTC)

Rank: Member

Joined: 10/29/2007(UTC)
Posts: 33

A good tool indeed. It helped me narrow things down to needing two rules. I kept

Disallow: /*.axd$

and, to catch the rest, like "webresource.axd?blahblahblah", I used

Disallow: /*.axd?

The first one only catches URLs ending in .axd; anything after that with "?blah" needs to be caught by the second one. Not a big fan of using a wildcard, but in this case it worked.
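
Putting both rules together, the relevant robots.txt section as settled on in this thread is:

User-agent: *
Disallow: /*.axd$
Disallow: /*.axd?

The $ anchor matches URLs that end in .axd with nothing after them, while the ? variant matches the same handlers when a query string follows.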



Thanks Andy!
Dan @ Wolfe
#12 Posted : Sunday, December 30, 2007 9:50:24 AM(UTC)

Rank: Member

Joined: 8/8/2007(UTC)
Posts: 298

So what is the best way to create a robots.txt file, and where is the best place to put it on my site?

Dan
MitchA
#13 Posted : Sunday, December 30, 2007 10:00:00 AM(UTC)

Rank: Member

Joined: 3/3/2006(UTC)
Posts: 1,737

Dan, it's dirt simple, and it goes in the site root. It's just a text file you can create in Notepad.

http://www.searchtools.com/robots/robots-txt.html
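
For example (hypothetical domain): crawlers only ever look for the file at the top of the site, so for a store at http://www.example.com it must be reachable as

http://www.example.com/robots.txt

with the Disallow rules from earlier in this thread inside it.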
Optimists invent airplanes,
Pessimists buy parachutes.
birdsafe
#14 Posted : Sunday, December 30, 2007 10:12:56 AM(UTC)

Rank: Member

Joined: 2/21/2007(UTC)
Posts: 1,113

I wish there were a way to filter out the "bogus" directory listings in the Event Log, where one of the directories is somehow getting added to the path (BV said it was an issue they were working on), so the bots can't find the pages because they aren't in the *added* directory path. I'm not sure what it does to SEO, if anything, but it makes the Event Log pretty useless; it fills up with hundreds of these entries daily.