I have written a custom std::basic_streambuf and std::basic_ostream because I want an output stream from which I can get a JNI string, much as you can call std::ostringstream::str(). These classes are quite simple.
namespace myns {

class jni_utf16_streambuf : public std::basic_streambuf<char16_t>
{
    JNIEnv *              d_env;
    std::vector<char16_t> d_buf;

    virtual int_type overflow(int_type);

public:
    jni_utf16_streambuf(JNIEnv *);
    jstring jstr() const;
};

typedef std::basic_ostream<char16_t, std::char_traits<char16_t>> utf16_ostream;

class jni_utf16_ostream : public utf16_ostream
{
    jni_utf16_streambuf d_buf;

public:
    jni_utf16_ostream(JNIEnv *);
    jstring jstr() const;
};

// ...

} // namespace myns
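The constructor and jstr() bodies are elided here. For illustration only, here is a minimal sketch of what jstr() might look like; this is my assumption, not the posted code, and it assumes jchar and char16_t share a 16-bit representation:

// Rough sketch, not the posted implementation. Assumes jchar and
// char16_t are both 16-bit and layout-compatible.
jstring myns::jni_utf16_streambuf::jstr() const
{
    // The characters written so far live in [pbase(), pptr()).
    return d_env->NewString(reinterpret_cast<const jchar *>(pbase()),
                            static_cast<jsize>(pptr() - pbase()));
}

jstring myns::jni_utf16_ostream::jstr() const
{
    return d_buf.jstr(); // delegate to the owned streambuf
}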
In addition, I have made four overloads of operator<<, all in the same namespace:
namespace myns {

// ...

utf16_ostream&     operator<<(utf16_ostream&, jstring) throw(std::bad_cast);
utf16_ostream&     operator<<(utf16_ostream&, const char *);
utf16_ostream&     operator<<(utf16_ostream&, const jni_utf16_string_region&);
jni_utf16_ostream& operator<<(jni_utf16_ostream&, jstring);

// ...

} // namespace myns
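The overload bodies are elided as well. Purely as a sketch of the intent, the jni_utf16_ostream overload for jstring might look like the following; the env() accessor is hypothetical (no such accessor appears in the declarations above):

// Illustrative sketch only -- env() is a hypothetical accessor,
// not part of the declarations shown above.
myns::jni_utf16_ostream& myns::operator<<(jni_utf16_ostream& o, jstring s)
{
    JNIEnv * env = o.env();               // hypothetical accessor
    jsize len = env->GetStringLength(s);
    std::vector<jchar> tmp(len);
    env->GetStringRegion(s, 0, len, tmp.data());
    o.write(reinterpret_cast<const char16_t *>(tmp.data()), len);
    return o;
}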
The implementation of jni_utf16_streambuf::overflow(int_type) is trivial: it just doubles the buffer width, puts the requested character, and sets the base, put, and end pointers correctly. It is tested and I am quite sure it works.
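In outline, a sketch reconstructed from that description (not the tested code itself; the initial capacity of 16 is an arbitrary assumption):

// Sketch reconstructed from the description above, not the tested code.
myns::jni_utf16_streambuf::int_type
myns::jni_utf16_streambuf::overflow(int_type ch)
{
    if (traits_type::eq_int_type(ch, traits_type::eof()))
        return traits_type::not_eof(ch);

    std::size_t used = pptr() - pbase();                  // characters written so far
    d_buf.resize(d_buf.empty() ? 16 : d_buf.size() * 2);  // double the buffer width
    setp(d_buf.data(), d_buf.data() + d_buf.size());      // reset base and end pointers
    pbump(static_cast<int>(used));                        // restore the put pointer
    return sputc(traits_type::to_char_type(ch));          // put the requested character
}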
The jni_utf16_ostream works fine when inserting Unicode characters. For example, the following works and leaves the stream containing "hello, world":
myns::jni_utf16_ostream o(env);
o << u"hello, wor" << u'l' << u'd';
My problem is that as soon as I try to insert an integer value, the stream's badbit gets set. For example:
myns::jni_utf16_ostream o(env);
if (o.bad()) throw "bad bit before"; // does not throw
int32_t x(5);
o << x;
if (o.bad()) throw "bad bit after";  // throws :(
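One way to get more information is to make the stream throw on badbit and look at what comes out. A diagnostic sketch (assumes <cstdio> and <exception> are included):

// Diagnostic sketch: throw on badbit instead of setting it silently,
// then report whatever exception the formatted inserter raised.
myns::jni_utf16_ostream o(env);
o.exceptions(std::ios_base::badbit);
try {
    o << int32_t(5);
} catch (std::exception const& e) {
    std::fprintf(stderr, "insertion failed: %s\n", e.what());
}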
I don't understand why this is happening! Is there some other method on std::basic_streambuf that I need to implement?