bug #54664: segfault in count_newlines in lexer.c

Submitter:  Tianxiao Gu <tianxiaogu>
Submitted:  Sat 15 Sep 2018 07:41:18 AM UTC
   
 
Category:  Syntax Parser
Severity:  5 - Average
Status:  Fixed
Assigned to:  None
Open/Closed:  Closed
Release:  None
Effort:  0.00


Mon 24 Sep 2018 05:50:39 AM UTC, comment #2: 

Thanks for the bug report.  I applied a fix to the PSPP master branch.

Ben Pfaff <blp>
Group administrator
Tue 18 Sep 2018 11:40:52 PM UTC, comment #1: 

We found that token->token_len is used without being initialized.

The `struct lex_token *token` is created at line 1375.
Looking at the code of `lex_push_token__`, the token's memory is allocated via `xnmalloc` in deque.c, so it is not necessarily zeroed.
The initialization code that follows (lines 1377, 1378, 1379, 1381, 1383) never assigns `token->token_len`; it only receives a proper value at line 1457.

But when `lex_source_read__` is called (at line 1398), `token_len` is read while a parsing error is reported, which leads to the crash.

1374   /* Append a new token to SRC and initialize it. */
1375   struct lex_token *token = lex_push_token__ (src);
1376   struct scanner scanner;
1377   scanner_init (&scanner, &token->token);
1378   token->line_pos = src->line_pos;
1379   token->token_pos = src->seg_pos;
1380   if (src->reader->line_number > 0)
1381     token->first_line = src->reader->line_number + src->n_newlines;
1382   else
1383     token->first_line = 0;
1384
1385   /* Extract segments and pass them through the scanner until we obtain a
1386      token. */
1387   for (;;)
1388     {
1389       /* Extract a segment. */
1390       const char *segment = &src->buffer[state.seg_pos - src->tail];
1391       size_t seg_maxlen = src->head - state.seg_pos;
1392       enum segment_type type;
1393       int seg_len = segmenter_push (&state.segmenter, segment, seg_maxlen,
1394                                     &type);
1395       if (seg_len < 0)
1396         {
1397           /* The segmenter needs more input to produce a segment. */
1398           lex_source_read__ (src);
1399           continue;
1400         }
1401    
1402       /* Update state based on the segment. */
1403       state.last_segment = type;
1404       state.seg_pos += seg_len;
1405       if (type == SEG_NEWLINE)
1406         {
1407           state.newlines++;
1408           state.line_pos = state.seg_pos;
1409         }
1410    
1411       /* Pass the segment into the scanner and try to get a token out. */
1412       enum scan_result result = scanner_push (&scanner, type,
1413                                               ss_buffer (segment, seg_len),
1414                                               &token->token);
1415       if (result == SCAN_SAVE)
1416         saved = state;
1417       else if (result == SCAN_BACK)
1418         {
1419           state = saved;
1420           break;
1421         }
1422       else if (result == SCAN_DONE)
1423         break;
1424     }
1425
1426   /* If we've reached the end of a line, or the end of a command, then pass
1427      the line to the output engine as a syntax text item.  */
1428   int n_lines = state.newlines;
1429   if (state.last_segment == SEG_END_COMMAND && !src->suppress_next_newline)
1430     {
1431       n_lines++;
1432       src->suppress_next_newline = true;
1433     }
1434   else if (n_lines > 0 && src->suppress_next_newline)
1435     {
1436       n_lines--;
1437       src->suppress_next_newline = false;
1438     }
1439   for (int i = 0; i < n_lines; i++)
1440     {
1441       const char *line = &src->buffer[src->journal_pos - src->tail];
1442       const char *newline = rawmemchr (line, '\n');
1443       size_t line_len = newline - line;
1444       if (line_len > 0 && line[line_len - 1] == '\r')
1445         line_len--;
1446
1447       char *syntax = malloc (line_len + 2);
1448       memcpy (syntax, line, line_len);
1449       syntax[line_len] = '\n';
1450       syntax[line_len + 1] = '\0';
1451
1452       text_item_submit (text_item_create_nocopy (TEXT_ITEM_SYNTAX, syntax));
1453
1454       src->journal_pos += newline - line + 1;
1455     }
1456
1457   token->token_len = state.seg_pos - src->seg_pos;
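
A minimal sketch of a fix along the lines of this analysis (the change actually committed to master may differ) is to give `token_len` a defined value as soon as the token is created, so that the error-reporting path reached through `lex_source_read__` never reads garbage:

  /* Append a new token to SRC and initialize it. */
  struct lex_token *token = lex_push_token__ (src);
  token->token_len = 0;         /* sketch: defined placeholder until the real
                                   length is computed at line 1457 */
  struct scanner scanner;
  scanner_init (&scanner, &token->token);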

Tianxiao Gu <tianxiaogu>
Sat 15 Sep 2018 07:41:18 AM UTC, original submission:  

When pspp is compiled with AddressSanitizer, the following segfault can be triggered; when it is compiled without AddressSanitizer, the crash does not reproduce.

Reproduce:

./src/ui/terminal/pspp test-case0

=================================================================
==1955==ERROR: AddressSanitizer: SEGV on unknown address 0x614000010000 (pc 0x7f4f3fa4c540 bp 0x7fff1fae28a0 sp 0x7fff1fae2018 T0)
==1955==The signal is caused by a READ memory access.
    #0 0x7f4f3fa4c53f  (/lib/x86_64-linux-gnu/libc.so.6+0x18a53f)
    #1 0x7f4f411645a1  (/usr/lib/x86_64-linux-gnu/libasan.so.4+0x415a1)
    #2 0x7f4f40c52dfe in count_newlines src/language/lexer/lexer.c:906
    #3 0x7f4f40c52f85 in lex_source_get_last_line_number src/language/lexer/lexer.c:926
    #4 0x7f4f40c534a9 in lex_get_last_line_number src/language/lexer/lexer.c:1003
    #5 0x55f3ba564627 in output_msg src/ui/terminal/main.c:226
    #6 0x7f4f407f7314 in ship_message src/libpspp/message.c:283
    #7 0x7f4f407f76df in submit_note src/libpspp/message.c:309
    #8 0x7f4f407f7ad7 in process_msg src/libpspp/message.c:349
    #9 0x7f4f407f7b39 in msg_emit src/libpspp/message.c:363
    #10 0x7f4f40c5443c in lex_source_read__ src/language/lexer/lexer.c:1206
    #11 0x7f4f40c55d4d in lex_source_get__ src/language/lexer/lexer.c:1398
    #12 0x7f4f40c50c29 in lex_get src/language/lexer/lexer.c:228
    #13 0x55f3ba564051 in main src/ui/terminal/main.c:135
    #14 0x7f4f3f8e3b96 in __libc_start_main (/lib/x86_64-linux-gnu/libc.so.6+0x21b96)
    #15 0x55f3ba563ac9 in _start (/home/t/Projects/fuzzing/pspp/pspp/src/ui/terminal/.libs/pspp+0x4ac9)

AddressSanitizer can not provide additional info.
SUMMARY: AddressSanitizer: SEGV (/lib/x86_64-linux-gnu/libc.so.6+0x18a53f)
==1955==ABORTING
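
The trace shows the out-of-bounds read happening in count_newlines() while an error message is being formatted, several frames below lex_source_read__. As a rough illustration only (a hypothetical sketch, not the actual lexer.c code), a newline-counting loop that is handed a length derived from an uninitialized field will scan far past the end of the buffer:

  #include <string.h>

  /* Hypothetical sketch of a newline-counting helper.  If LENGTH is derived
     from an uninitialized value (e.g. a garbage token_len), memchr keeps
     scanning past the real end of the buffer -- the READ access that
     AddressSanitizer reports above. */
  static int
  count_newlines_sketch (const char *s, size_t length)
  {
    int n_newlines = 0;
    const char *newline;

    while ((newline = memchr (s, '\n', length)) != NULL)
      {
        n_newlines++;
        length -= (newline + 1) - s;
        s = newline + 1;
      }

    return n_newlines;
  }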

Tianxiao Gu <tianxiaogu>

 


Attached Files
file #45015:  test-case-0 added by tianxiaogu (339B - application/octet-stream)

 


 


    Follow 3 latest changes.

    Date        Changed by   Updated Field   Previous Value => Replaced by
    2018-09-24  blp          Status          None => Fixed
                             Open/Closed     Open => Closed
    2018-09-15  tianxiaogu   Attached File   - => Added test-case-0, #45015
