Function peekDelimiterInclusive

Returns a slice of the next bytes of buffered data from the stream until the delimiter is found, without advancing the seek position. The returned slice includes the delimiter as the last byte. Invalidates previously returned values from peek. See also: peekSentinel, peekDelimiterExclusive, takeDelimiterInclusive.
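
As an illustration (not part of the upstream documentation), a minimal sketch of a common pattern: peek each line including its trailing newline, process it, then toss it to advance. It assumes the generic reader interface lives at std.Io.Reader, as in recent Zig versions:

const std = @import("std");
const Reader = std.Io.Reader; // assumed import path for the Reader shown on this page
const testing = std.testing;

test "peek each line, then toss it" {
    var r: Reader = .fixed("one\ntwo\n");
    var count: usize = 0;
    while (true) {
        // Peeking does not advance the seek position; the slice still ends with '\n'.
        const line = r.peekDelimiterInclusive('\n') catch break; // stop once no complete line remains
        count += 1;
        // Advance past the peeked bytes once they have been processed.
        r.toss(line.len);
    }
    try testing.expectEqual(@as(usize, 2), count);
}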

Prototype

pub fn peekDelimiterInclusive(r: *Reader, delimiter: u8) DelimiterError![]u8

Parameters

r: *Reader
delimiter: u8

Possible Errors

EndOfStream

For "inclusive" functions, stream ended before the delimiter was found. For "exclusive" functions, stream ended and there are no more bytes to return.

ReadFailed

See the Reader implementation for detailed diagnostics.

StreamTooLong

The delimiter was not found within the buffer capacity of the Reader.
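
To make the inclusive/exclusive distinction concrete, here is a small sketch (an illustration, not from the upstream documentation). It reuses the Reader and testing declarations from the Example section below, and assumes peekDelimiterExclusive returns the remaining buffered bytes when the stream ends, as the EndOfStream description above implies:

test "inclusive vs exclusive at end of stream" {
    var r: Reader = .fixed("ab\nc");
    r.toss(3); // skip past "ab\n", leaving only "c" with no further delimiter
    // Inclusive: the stream ended before another '\n' was found.
    try testing.expectError(error.EndOfStream, r.peekDelimiterInclusive('\n'));
    // Exclusive (assumed behavior, per the description above): the remaining byte is returned.
    try testing.expectEqualStrings("c", try r.peekDelimiterExclusive('\n'));
}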

Example

test peekDelimiterInclusive {
    var r: Reader = .fixed("ab\nc");
    try testing.expectEqualStrings("ab\n", try r.peekDelimiterInclusive('\n'));
    try testing.expectEqualStrings("ab\n", try r.peekDelimiterInclusive('\n'));
    r.toss(3);
    try testing.expectError(error.EndOfStream, r.peekDelimiterInclusive('\n'));
}

Source

pub fn peekDelimiterInclusive(r: *Reader, delimiter: u8) DelimiterError![]u8 {
    const buffer = r.buffer[0..r.end];
    const seek = r.seek;
    if (std.mem.indexOfScalarPos(u8, buffer, seek, delimiter)) |end| {
        @branchHint(.likely);
        return buffer[seek .. end + 1];
    }
    // TODO take a parameter for max search length rather than relying on buffer capacity
    try rebase(r, r.buffer.len);
    while (r.buffer.len - r.end != 0) {
        const end_cap = r.buffer[r.end..];
        var writer: Writer = .fixed(end_cap);
        const n = r.vtable.stream(r, &writer, .limited(end_cap.len)) catch |err| switch (err) {
            error.WriteFailed => unreachable,
            else => |e| return e,
        };
        r.end += n;
        if (std.mem.indexOfScalarPos(u8, end_cap[0..n], 0, delimiter)) |end| {
            return r.buffer[0 .. r.end - n + end + 1];
        }
    }
    return error.StreamTooLong;
}