Skip corrupt log entries


Brad Konia
Hi Toby,

I don't see any problem with skipping corrupt entries, particularly if the corrupt entry appears in the grid, marked as corrupt. Instead of displaying it nicely formatted into columns, it could display the raw JSON in a single column. That way no data would be lost, and if the user needs to access the data, they could either parse it manually or fix the error. This would be far better than the current approach, where it crashes on a corrupt entry.
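The behaviour suggested above could be sketched roughly like this (a hypothetical Python illustration, assuming one JSON entry per line; LogViewPlus itself is a .NET application, so this is not its actual implementation):

```python
import json

def parse_log_lines(lines):
    """Parse one-entry-per-line JSON logs, keeping corrupt entries.

    Instead of failing the whole file, a line that does not parse is
    kept as raw text and flagged, so no data is lost and the user can
    inspect or repair it later.
    """
    for line_no, line in enumerate(lines, start=1):
        try:
            yield {"line": line_no, "corrupt": False, "entry": json.loads(line)}
        except json.JSONDecodeError:
            yield {"line": line_no, "corrupt": True, "raw": line}

rows = list(parse_log_lines([
    '{"level": "INFO", "msg": "started"}',
    '{"level": "ERROR", "msg": "broken',  # truncated entry
]))
```

A grid could then render `entry` in formatted columns and fall back to `raw` in a single column for flagged rows.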
LogViewPlus Support
Hi Brad,

Thanks for the feedback. 

We can look into revisiting the JSON and XML parsers to skip and report on invalid log entries.  If we do this, it is likely that some valid log entries will also be skipped as the parser attempts to reset with the next valid entry.  If the parser is unable to do this, all remaining log entries will be skipped.  So, if your JSON log entries are valid, a better solution may be to address the underlying JSON parsing issue.  If you are able to isolate the parsing issue or clean up the log file, please feel free to contact me directly.
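To illustrate why resynchronisation can also drop neighbouring valid lines, here is a hypothetical Python sketch. It assumes multi-line JSON entries where each new entry starts with `{` at the beginning of a line; this is not LogViewPlus's actual parser, just a model of the behaviour described above:

```python
import json

def parse_multiline(lines):
    """Skip-and-resync sketch for multi-line JSON log entries.

    Lines are buffered until the next entry appears to start.  If the
    buffered block fails to parse, the whole block is dropped - which
    is how a corrupt entry can take adjacent lines down with it.
    """
    entries, dropped = [], []
    buf = []

    def flush():
        if not buf:
            return
        block = "\n".join(buf)
        try:
            entries.append(json.loads(block))
        except json.JSONDecodeError:
            dropped.append(block)

    for line in lines:
        if line.startswith("{") and buf:
            flush()          # previous entry ended: parse or drop it
            buf = []
        buf.append(line)
    flush()
    return entries, dropped

entries, dropped = parse_multiline([
    '{"a": 1}',
    '{"b": broken',   # corrupt entry
    'oops',           # swallowed along with the corrupt block
    '{"c": 3}',
])
```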

We use JSON.Net internally, which is a very popular JSON parser. 

Thanks,

Toby
Brad Konia
Using a generic parser is not an option for deeply nested, complex JSON objects. I agree that we need an option to skip corrupt records. If the parser encounters a corrupt record, it can display the record in the grid perhaps with some indicator to show that it's corrupt. This should be an option that's up to the individual user.
Also, the problem I'm experiencing is that your JSON parser is buggy and often crashes on perfectly valid records, reporting them as invalid JSON. I know the issue is the parser because:
  1. I've tried validating the supposedly invalid records in an external JSON validator and they validate just fine.
  2. I have a 36 MB log file and each time I load the file, it crashes on a different record.
The reason others probably haven't reported this is that most people aren't parsing such large, complex JSON structures. Unfortunately, I can't send you a test file because it contains sensitive information.
Perhaps you could check whether there are updates to the JSON parser library. For my purposes though, the most important thing is to have the option to bypass records the parser thinks are corrupt.
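One way to narrow down which records a strict parser rejects would be a small validation script along these lines (a hypothetical sketch; it assumes a one-entry-per-line layout, which may not match a deeply nested, multi-line log):

```python
import json

def find_invalid_lines(lines):
    """Return (line_number, error) pairs for lines that fail to parse.

    Assumes one JSON entry per line (NDJSON).  Running this over a log
    file helps isolate the exact records a strict parser is rejecting,
    without sharing the file itself.
    """
    bad = []
    for n, line in enumerate(lines, start=1):
        line = line.strip()
        if not line:
            continue  # skip blank lines
        try:
            json.loads(line)
        except json.JSONDecodeError as exc:
            bad.append((n, str(exc)))
    return bad
```

Reporting only line numbers and parse errors also avoids exposing any sensitive content.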

LogViewPlus Support
Glad to hear the solution we discussed offline helped - thanks for letting me know!

For anyone else finding this post, when a strict parser such as XML or JSON is being used and the underlying log entry is invalid, the entire file will fail to parse.  This is by design to ensure no data is lost when searching the file.

As highlighted above, the work-around for this behaviour is to use a more generic parser type.
May Fly
The solution for me is to use the "Pattern Parser" instead of the "XML Parser". Thanks for your help.
LogViewPlus Support
Skipping log items isn't a very good solution as it might lead to important data being hidden.  

It is not clear why the corrupt items are causing issues, as LogViewPlus should just append the invalid log entry to the previous entry.  Are you able to send me a sample log entry and more information on how the log file is being parsed?
May Fly
Our logger seems to have a threading problem, so it sometimes happens that one log item (log4net) is interrupted by another. I then have to remove these corrupt items manually from the log file and retry parsing. Log4View can handle this by simply skipping the broken item. It would be great if LogViewPlus could also do so.
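For illustration, interleaved log4net XML entries could be filtered down to the well-formed ones along these lines (a hypothetical Python sketch, not a LogViewPlus feature):

```python
import re
import xml.etree.ElementTree as ET

# log4net's XML layout writes one <log4j:event> element per entry.
# With a threading bug, two entries can interleave and corrupt each
# other.  This sketch keeps only the blocks that still parse as
# well-formed XML and drops the rest.
EVENT = re.compile(r"<log4j:event.*?</log4j:event>", re.DOTALL)
NS = 'xmlns:log4j="http://jakarta.apache.org/log4j/"'

def well_formed_events(text):
    good = []
    for match in EVENT.finditer(text):
        block = match.group(0)
        try:
            # wrap the block to declare the log4j namespace prefix
            ET.fromstring("<root %s>%s</root>" % (NS, block))
            good.append(block)
        except ET.ParseError:
            pass  # interleaved / truncated entry: skip it
    return good
```

An entry that was interrupted mid-write by another thread will fail the well-formedness check and be skipped, mimicking the Log4View behaviour described above.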
LogViewPlus Support
Hi May,

Usually what LogViewPlus does in this case is append the corrupt log entry on to the previous log entry.  If you can tell me more about the cancellation, I may be able to help.  Would it be possible to post a sample log entry?

Thanks,

Toby
May Fly
It would be great if LogViewPlus did not cancel parsing when it encounters a corrupt log entry, but instead tried to skip the broken entry and continue with the next. This is the way "Log4View" works.

-Matej