I am trying to tokenize lines in a file using _tcstok. I am able to tokenize the line once, but when I try to tokenize it a second time, I get an access violation. I feel like it has something to do with not actually accessing the values but the locations instead; I'm not sure how else to do this, though.
Thanks,
Dave
P.S. I'm using TCHAR and _tcstok because the file is UTF-8.
This is the error I'm getting:
First-chance exception at 0x63e866b4 (msvcr90d.dll) in Testing.exe: 0xC0000005: Access violation reading location 0x0000006c.
vector<TCHAR> TabDelimitedSource::getNext() {
    // Returns the next document (a given cell) from the file(s)
    // Return NULL if no more documents/rows
    TCHAR row[256];
    vector<TCHAR> document;
    try {
        // Read each line in the file, corresponding to an individual document
        buff_reader->getline(row, 10000);
    }
    catch (ifstream::failure e) {
        ; // Ignore and fall through
    }
    if (_tcslen(row) > 0) {
        this->current_row += 1;
        vector<TCHAR> cells;
        // Separate the line on tabs (id 'tab' document title 'tab' document body)
        TCHAR * pch;
        pch = _tcstok(row, "\t");
        while (pch != NULL) {
            cells.push_back(*pch);
            pch = _tcstok(NULL, "\t");
        }
        // Split the cell into individual words using the lucene analyzer
        try {
            // Separate the body by spaces
            TCHAR original_document;
            original_document = (cells[column_holding_doc]);
            try {
                TCHAR * pc;
                pc = _tcstok((char*)original_document, " ");
                while (pch != NULL) {
                    document.push_back(*pc);
                    pc = _tcstok(NULL, "\t");
                }
First up, your code is a mongrel mixture of C string manipulation and C++ containers. This will just dig you into a hole. Ideally you should tokenize the line into a std::vector<std::wstring>.

Also, you're very confused about TCHAR and UTF-8. TCHAR is a character type that 'floats' between 8 and 16 bits depending on compile-time flags. UTF-8 files use between one and four bytes to represent each character. So, you probably want to hold the text as std::wstring objects, but you're going to need to explicitly convert the UTF-8 into wstrings.
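Something along these lines would do it (a rough sketch, not drop-in code: utf8_to_wstring and split_on_tabs are just illustrative names, and it assumes Windows so MultiByteToWideChar is available):

#include <windows.h>
#include <string>
#include <vector>

// Convert a line of raw UTF-8 bytes into a std::wstring (UTF-16 on Windows).
std::wstring utf8_to_wstring(const std::string &utf8)
{
    if (utf8.empty())
        return std::wstring();
    int len = MultiByteToWideChar(CP_UTF8, 0, utf8.c_str(), (int)utf8.size(), NULL, 0);
    std::wstring wide(len, L'\0');
    MultiByteToWideChar(CP_UTF8, 0, utf8.c_str(), (int)utf8.size(), &wide[0], len);
    return wide;
}

// Split a wide line on tabs into its cells, copying each token into its own string.
std::vector<std::wstring> split_on_tabs(const std::wstring &line)
{
    std::vector<std::wstring> cells;
    std::wstring::size_type start = 0, tab;
    while ((tab = line.find(L'\t', start)) != std::wstring::npos)
    {
        cells.push_back(line.substr(start, tab - start));
        start = tab + 1;
    }
    cells.push_back(line.substr(start));
    return cells;
}

You'd read each line into a std::string with std::getline, convert it, split on tabs, and then split the body cell again on L' ' with the same kind of loop. Because each token is copied into its own std::wstring, there are no dangling pointers into the line buffer to worry about.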
But, if you just want to get anything working, focus on your tokenization. You need to store the address of the start of each token (as a TCHAR*), but your vector is a vector of TCHARs instead. When you try to use the token data, you're casting TCHARs to TCHAR* pointers, with the unsurprising result of access violations. The AV address you give is 0x0000006c, which is the ASCII code for the character 'l'.
vector<TCHAR*> cells;
...
cells.push_back(pch);
... and then...
TCHAR *original_document = cells[column_holding_doc];
TCHAR *pc = _tcstok(original_document," ");
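Put together, the fixed loops would look roughly like this (a sketch only, reusing row, document and column_holding_doc from your code; if you build with _UNICODE, the literals need to be wrapped in _T() so they match TCHAR):

vector<TCHAR*> cells;
TCHAR *pch = _tcstok(row, _T("\t"));
while (pch != NULL) {
    cells.push_back(pch);              // store the pointer to the token, not its first character
    pch = _tcstok(NULL, _T("\t"));
}

TCHAR *original_document = cells[column_holding_doc];
TCHAR *pc = _tcstok(original_document, _T(" "));
while (pc != NULL) {                   // test pc here, not pch
    document.push_back(pc);            // document would need to be vector<TCHAR*> too
    pc = _tcstok(NULL, _T(" "));       // keep splitting on spaces, not tabs
}

Bear in mind that every one of these pointers points into row, so copy the text out (into std::wstring, say) before the next getline overwrites the buffer or row goes out of scope.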