
Bug #7921 inline renderer hangs indefinitely on strings with \0
Submitted: 2006-06-16 12:22 UTC
From: mrten+pear at ii dot nl
Assigned: chagenbu
Status: Closed
Package: Text_Diff
PHP Version: 4.3.4
OS: linux
Roadmaps: (Not assigned)

 [2006-06-16 12:22 UTC] mrten+pear at ii dot nl (Mrten)
Description:
------------
Preliminary report: if one compares two strings where one contains a \0 (zero byte), the inline renderer hangs (and eventually dies with a fatal memory error) in _splitOnWords() in Diff/Renderer/inline.php.

This happens because both strspn() and strcspn() in the while loop return 0 (perhaps a PHP bug? ISTR that the C counterpart has the same problem), so $spaces and $nextpos are both 0, which causes $pos to never be incremented. Meanwhile $words[] keeps growing, which eventually terminates the script with a fatal memory overflow.

I've temporarily patched it with $string = str_replace("\0", '', $string); at the start of the function.
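A minimal sketch of the workaround described above. The function name and the exact loop shape are illustrative (the real code lives in Text_Diff's Diff/Renderer/inline.php); the point is that stripping NUL bytes up front guarantees strspn()/strcspn() always make progress, so $pos cannot get stuck:

```php
<?php
// Illustrative stand-in for Text_Diff_Renderer_inline::_splitOnWords().
// The str_replace() line is the one-line patch from the report; without
// it, a "\0" byte could make both strspn() and strcspn() return 0, so
// $pos never advances and $words[] grows until memory is exhausted.
function splitOnWords($string, $newlineEscape = "\n")
{
    // The reporter's temporary patch: drop NUL bytes before splitting.
    $string = str_replace("\0", '', $string);

    $words  = array();
    $length = strlen($string);
    $pos    = 0;

    while ($pos < $length) {
        // Count leading whitespace, then the non-whitespace run after it.
        $spaces  = strspn(substr($string, $pos), " \n");
        $nextpos = strcspn(substr($string, $pos + $spaces), " \n");
        $words[] = str_replace("\n", $newlineEscape,
                               substr($string, $pos, $spaces + $nextpos));
        // With no "\0" bytes, $spaces + $nextpos is always >= 1 here.
        $pos    += $spaces + $nextpos;
    }

    return $words;
}
```

With the NUL byte removed, the input terminates normally, e.g. splitOnWords("foo bar\0baz") yields the pieces of "foo barbaz" instead of looping forever.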


 [2006-06-16 12:26 UTC] yunosh (Jan Schneider)
The package is called Text_Diff, i.e. it's supposed to create diffs for texts, not binary data.
 [2006-06-16 12:41 UTC] mrten+pear at ii dot nl
That is, to say the least, a silly response to a one-line patch. If you really don't want to add the fix, please add a remark to the documentation that one's script will terminate with a fatal memory error if there happens to be a \0 in the data. (I'm not comparing binaries; the fact that there is a \0 in the -textual- data is indeed considered a bug!) But crashing PHP like that is just plain silly.
 [2006-06-16 14:07 UTC] chagenbu at php dot net (Chuck Hagenbuch)
This bug has been fixed in CVS. If this was a documentation problem, the fix will appear on pear.php.net by the end of next Sunday (CET). If this was a problem with the website, the change should be live shortly. Otherwise, the fix will appear in the package's next release. Thank you for the report and for helping us make PEAR better.

I agree with the reporter; we don't have to do anything /nice/ with bad data, but not crashing is good.