Let's say I have this Python code:
def double_inputs():
    while True:
        x = yield
        yield x * 2
gen = double_inputs()
next(gen)
print(gen.send(1))
It prints "2", just as expected.
I can make a generator in C++20 like this:
#include <coroutine>
#include <exception>
#include <iostream>
#include <string>
template <class T>
struct generator {
    struct promise_type;
    using coro_handle = std::coroutine_handle<promise_type>;
    struct promise_type {
        T current_value;
        auto get_return_object() { return generator{coro_handle::from_promise(*this)}; }
        auto initial_suspend() { return std::suspend_always{}; }
        auto final_suspend() noexcept { return std::suspend_always{}; }
        void return_void() {}
        void unhandled_exception() { std::terminate(); }
        auto yield_value(T value) {
            current_value = value;
            return std::suspend_always{};
        }
    };
    bool next() { return coro ? (coro.resume(), !coro.done()) : false; }
    T value() { return coro.promise().current_value; }
    generator(generator const &rhs) = delete;
    generator(generator &&rhs)
        : coro(rhs.coro)
    {
        rhs.coro = nullptr;
    }
    ~generator() {
        if (coro)
            coro.destroy();
    }
private:
    generator(coro_handle h) : coro(h) {}
    coro_handle coro;
};
generator<char> hello(){
    //TODO: send string here via co_await, but HOW???
    std::string word = "hello world";
    for(auto &ch : word){
        co_yield ch;
    }
}
int main(int, char**) {
    for (auto i = hello(); i.next(); ) {
        std::cout << i.value() << ' ';
    }
}
This generator just produces a string letter by letter, but the string is hardcoded in it. In Python, it is possible not only to yield something FROM the generator but also to send something TO it. I believe the same could be done via co_await in C++.
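Roughly, the behaviour I want would look like this in Python (just a sketch; the names are only illustrative):
def hello():
    word = yield          # sketch only: wait for the caller to send a string in
    for ch in word:
        yield ch          # then hand the letters back out one at a time

gen = hello()
next(gen)                         # advance to the first bare `yield`
first = gen.send("hello world")   # the "producer" side
print(first)                      # h
for ch in gen:
    print(ch)                     # e l l o   w o r l d ...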
I need it to work like this:
generator<char> hello(){
    std::string word = co_await producer; // Wait string from producer somehow
    for(auto &ch : word){
        co_yield ch;
    }
}
int main(int, char**) {
    auto gen = hello(); //make consumer
    producer("hello world"); //produce string
    for (; gen.next(); ) {
        std::cout << gen.value() << ' '; //consume string letter by letter
    }
}
How can I achieve that? How do I make this "producer" using C++20 coroutines?
You have essentially two problems to overcome if you want to do this.
The first is that C++ is a statically typed language. This means that the types of everything involved need to be known at compile time. This is why your generator type needs to be a template, so that the user can specify what type it shepherds from the coroutine to the caller.
So if you want to have this bi-directional interface, then something on your hello function must specify both the output type and the input type.
The simplest way to go about this is to just create an object and pass a non-const reference to that object to the generator. Each time it does a co_yield, the caller can modify the referenced object and then ask for a new value. The coroutine can read from the reference and see the given data.
However, if you insist on using the future type of the coroutine as both output and input, then you need to solve the first problem (by making your generator template take both an OutputType and an InputType) as well as this second problem.
See, your goal is to get a value to the coroutine. The problem is that the source of that value (the function calling your coroutine) has a future object. But the coroutine cannot access the future object. Nor can it access the promise object that the future references.
Or at least, it can't do so easily.
There are two ways to go about this, with different use cases. The first manipulates the coroutine machinery to backdoor a way into the promise. The second manipulates a property of co_yield to do basically the same thing.
Transform
The promise object for a coroutine is usually hidden and inaccessible from the coroutine. It is accessible to the future object, which the promise creates and which acts as an interface to the promised data. But it is also accessible during certain parts of the co_await machinery.
Specifically, when you perform a co_await on any expression in a coroutine, the machinery looks at your promise type to see if it has a function called await_transform. If so, it will call that promise object's await_transform on every expression you co_await on (at least, in a co_await that you directly write, not implicit awaits, such as the one created by co_yield).
As such, we need to do two things: create an overload of await_transform on the promise type, and create a type whose sole purpose is to allow us to call that await_transform function.
So that would look something like this:
struct generator_input {};
...
//Within the promise type:
auto await_transform(generator_input);
One quick note. The downside of using await_transform like this is that, by specifying even one overload of this function for our promise, we impact every co_await in any coroutine that uses this type. For a generator coroutine, that's not very important, since there's not much reason to co_await unless you're doing a hack like this. But if you were creating a more general mechanism that could distinctly await on arbitrary awaitables as part of its generation, you'd have a problem.
OK, so we have this await_transform function; what does this function need to do? It needs to return an awaitable object, since co_await is going to await on it. But the purpose of this awaitable object is to deliver a reference to the input type. Fortunately, the mechanism co_await uses to convert the awaitable into a value is provided by the awaitable's await_resume method. So ours can just return an InputType&:
//Within the `generator<OutputType, InputType>`:
struct passthru_value
{
    InputType &ret_;
    bool await_ready() { return true; }
    void await_suspend(coro_handle) {}
    InputType &await_resume() { return ret_; }
};

//Within the promise type:
auto await_transform(generator_input)
{
    return passthru_value{input_value}; //Where `input_value` is the `InputType` object stored by the promise.
}
This gives the coroutine access to the value, by invoking co_await generator_input{};. Note that this returns a reference to the object.
The generator type can easily be modified to allow the ability to modify an InputType object stored in the promise. Simply add a pair of send functions for overwriting the input value:
void send(const InputType &input)
{
    coro.promise().input_value = input;
}
void send(InputType &&input)
{
    coro.promise().input_value = std::move(input);
}
This represents an asymmetric transport mechanism. The coroutine retrieves a value at a place and time of its own choosing. As such, it is under no real obligation to respond instantly to any changes. This is good in some respects, as it allows a coroutine to insulate itself from deleterious changes. If you're using a range-based for loop over a container, that container cannot be directly modified (in most ways) by the outside world or else your program will exhibit UB. So if the coroutine is fragile in that way, it can copy the data from the user and thus prevent the user from modifying it.
All in all, the needed code isn't that large. Here's a runnable example of your code with these modifications:
#include <coroutine>
#include <exception>
#include <string>
#include <iostream>
struct generator_input {};
template <typename OutputType, typename InputType>
struct generator {
    struct promise_type;
    using coro_handle = std::coroutine_handle<promise_type>;
    struct passthru_value
    {
        InputType &ret_;
        bool await_ready() { return true; }
        void await_suspend(coro_handle) {}
        InputType &await_resume() { return ret_; }
    };
    struct promise_type {
        OutputType current_value;
        InputType input_value;
        auto get_return_object() { return generator{coro_handle::from_promise(*this)}; }
        auto initial_suspend() { return std::suspend_always{}; }
        auto final_suspend() noexcept { return std::suspend_always{}; }
        void unhandled_exception() { std::terminate(); }
        auto yield_value(OutputType value) {
            current_value = value;
            return std::suspend_always{};
        }
        void return_void() {}
        auto await_transform(generator_input)
        {
            return passthru_value{input_value};
        }
    };
    bool next() { return coro ? (coro.resume(), !coro.done()) : false; }
    OutputType value() { return coro.promise().current_value; }
    void send(const InputType &input)
    {
        coro.promise().input_value = input;
    }
    void send(InputType &&input)
    {
        coro.promise().input_value = std::move(input);
    }
    generator(generator const &rhs) = delete;
    generator(generator &&rhs)
        : coro(rhs.coro)
    {
        rhs.coro = nullptr;
    }
    ~generator() {
        if (coro)
            coro.destroy();
    }
private:
    generator(coro_handle h) : coro(h) {}
    coro_handle coro;
};
generator<char, std::string> hello(){
    auto word = co_await generator_input{};
    for(auto &ch : word){
        co_yield ch;
    }
}
int main(int, char**)
{
    auto test = hello();
    test.send("hello world");
    while(test.next())
    {
        std::cout << test.value() << ' ';
    }
}
Be more yielding
An alternative to using an explicit co_await is to exploit a property of co_yield. Namely, co_yield is an expression and therefore it has a value. Specifically, it is (mostly) equivalent to co_await p.yield_value(e), where p is the promise object (ohh!) and e is what we're yielding.
Fortunately, we already have a yield_value function; it returns std::suspend_always. It could instead return an object that still always suspends, but which co_await can also unpack into an InputType&:
struct yield_thru
{
    InputType &ret_;
    bool await_ready() { return false; }
    void await_suspend(coro_handle) {}
    InputType &await_resume() { return ret_; }
};
...
//In the promise:
auto yield_value(OutputType value) {
    current_value = value;
    return yield_thru{input_value};
}
This is a symmetric transport mechanism; for every value you yield, you receive a value (which may be the same one as before). Unlike the explicit co_await method, you can't receive a value before you start to generate them. This could be useful for certain interfaces.
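This is the same shape as the Python generator in the question: each yield both hands a value out and receives whatever the caller passes to send(). A small Python sketch of that round trip (illustrative only, not part of the C++ solution):
def doubler():
    result = None
    while True:
        # Each `yield` both emits the previous result and
        # receives the next value supplied via send().
        received = yield result
        result = received * 2

gen = doubler()
next(gen)            # run to the first yield
print(gen.send(3))   # 6
print(gen.send(10))  # 20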
And of course, you could combine them as you see fit.
Given a string of JSON data, how can I safely turn that string into a JavaScript object?
Obviously I can do this unsafely with something like:
var obj = eval("(" + json + ')');
but that leaves me vulnerable to the JSON string containing other code, which it seems very dangerous to simply eval.
JSON.parse(jsonString) is a pure JavaScript approach so long as you can guarantee a reasonably modern browser.
The jQuery method is now deprecated. Use this method instead:
let jsonObject = JSON.parse(jsonString);
Original answer using deprecated jQuery functionality:
If you're using jQuery just use:
jQuery.parseJSON( jsonString );
It's exactly what you're looking for (see the jQuery documentation).
This answer is for IE < 7; for modern browsers, check Jonathan's answer above.
This answer is outdated and Jonathan's answer above (JSON.parse(jsonString)) is now the best answer.
JSON.org has JSON parsers for many languages, including four different ones for JavaScript. I believe most people would consider json2.js their go-to implementation.
Use the simple code example in "JSON.parse()":
var jsontext = '{"firstname":"Jesper","surname":"Aaberg","phone":["555-0100","555-0120"]}';
var contact = JSON.parse(jsontext);
and reversing it:
var str = JSON.stringify(arr);
This seems to be the issue:
An input is received via Ajax, a websocket, etc., and it will always be in string format, but you need to know whether it is JSON.parse-able. The trouble is that if you always run it through JSON.parse, the program MAY continue "successfully", but you'll still see an error thrown in the console with the dreaded "Error: unexpected token 'x'".
var data;
try {
data = JSON.parse(jqxhr.responseText);
} catch (_error) {}
data || (data = {
message: 'Server error, please retry'
});
I'm not sure about other ways to do it but here's how you do it in Prototype (JSON tutorial).
new Ajax.Request('/some_url', {
method:'get',
requestHeaders: {Accept: 'application/json'},
onSuccess: function(transport){
var json = transport.responseText.evalJSON(true);
}
});
Calling evalJSON() with true as the argument sanitizes the incoming string.
If you're using jQuery, you can also use:
$.getJSON(url, function(data) { });
Then you can do things like
data.key1.something
data.key1.something_else
etc.
Just for fun, here is a way using a function:
jsonObject = (new Function('return ' + jsonFormatData))()
$.ajax({
url: url,
dataType: 'json',
data: data,
success: callback
});
The callback is passed the returned data, which will be a JavaScript object or array as defined by the JSON structure and parsed using the $.parseJSON() method.
Using JSON.parse is probably the best way.
Here's an example
var jsonRes = '{ "students" : [' +
'{ "firstName":"Michel" , "lastName":"John" ,"age":18},' +
'{ "firstName":"Richard" , "lastName":"Joe","age":20 },' +
'{ "firstName":"James" , "lastName":"Henry","age":15 } ]}';
var studentObject = JSON.parse(jsonRes);
The easiest way is to use the JSON.parse() method:
var response = '{"result":true,"count":1}';
var JsonObject= JSON.parse(response);
Then you can get the values of the JSON elements, for example:
var myResponseResult = JsonObject.result;
var myResponseCount = JsonObject.count;
Using jQuery as described in the jQuery.parseJSON() documentation:
JSON.parse(jsonString);
Try using this method with a Data string, e.g. Data = '{result:true,count:1}':
try {
eval('var obj=' + Data);
console.log(obj.count);
}
catch(e) {
console.log(e.message);
}
This method really helps in Node.js when you are working with serial port programming.
I found a "better" way:
In CoffeeScript:
try data = JSON.parse(jqxhr.responseText)
data ||= { message: 'Server error, please retry' }
In JavaScript:
var data;
try {
data = JSON.parse(jqxhr.responseText);
} catch (_error) {}
data || (data = {
message: 'Server error, please retry'
});
JSON parsing is always a pain. If the input is not as expected, it throws an error and crashes what you are doing.
You can use the following tiny function to safely parse your input. It always returns an object, even if the input is not valid JSON or is already an object, which is better for most cases:
JSON.safeParse = function (input, def) {
// Convert null to empty object
if (!input) {
return def || {};
} else if (Object.prototype.toString.call(input) === '[object Object]') {
return input;
}
try {
return JSON.parse(input);
} catch (e) {
return def || {};
}
};
Parse the JSON string with JSON.parse(), and the data becomes a JavaScript object:
JSON.parse(jsonString)
Here, JSON is the built-in object used to process the JSON data.
Imagine we received this text from a web server:
'{ "name":"John", "age":30, "city":"New York"}'
To parse into a JSON object:
var obj = JSON.parse('{ "name":"John", "age":30, "city":"New York"}');
Here obj is the respective JSON object which looks like:
{ "name":"John", "age":30, "city":"New York"}
To fetch a value use the . operator:
obj.name // John
obj.age //30
Convert a JavaScript object into a string with JSON.stringify().
JSON.parse(jsonString);
JSON.parse will turn the string into an object.
JSON.parse() converts any JSON string passed into the function into a JSON object.
To understand it better, press F12 to open "Inspect Element" in your browser and go to the console to write the following commands:
var response = '{"result":true,"count":1}'; //sample json object(string form)
JSON.parse(response); //converts passed string to JSON Object.
Now run the command:
console.log(JSON.parse(response));
You'll get output as an Object {result: true, count: 1}.
In order to use that object, you can assign it to a variable, for example obj:
var obj = JSON.parse(response);
By using obj and the dot (.) operator you can access properties of the JSON object.
Try to run the command:
console.log(obj.result);
Official documentation:
The JSON.parse() method parses a JSON string, constructing the JavaScript value or object described by the string. An optional reviver function can be provided to perform a transformation on the resulting object before it is returned.
Syntax:
JSON.parse(text[, reviver])
Parameters:
text
: The string to parse as JSON. See the JSON object for a description of JSON syntax.
reviver (optional)
: If a function, this prescribes how the value originally produced by parsing is transformed, before being returned.
Return value
The Object corresponding to the given JSON text.
Exceptions
Throws a SyntaxError exception if the string to parse is not valid JSON.
If we have a string like this:
"{\"status\":1,\"token\":\"65b4352b2dfc4957a09add0ce5714059\"}"
then we can simply use JSON.parse twice to convert this string to a JSON object (this works because the payload is JSON that has itself been encoded as a JSON string):
var sampleString = "{\"status\":1,\"token\":\"65b4352b2dfc4957a09add0ce5714059\"}"
var jsonString= JSON.parse(sampleString)
var jsonObject= JSON.parse(jsonString)
And we can extract values from the JSON object using:
// instead of last JSON.parse:
var { status, token } = JSON.parse(jsonString);
The result will be:
status = 1 and token = 65b4352b2dfc4957a09add0ce5714059
Performance
There are already good answers for this question, but I was curious about performance, and today (2020.09.21) I conducted tests on macOS High Sierra 10.13.6 in Chrome v85, Safari v13.1.2, and Firefox v80 for the chosen solutions.
Results
The eval/Function approaches (A, B, C) are fast on Chrome (but for the big deep object with N=1000 they crash with "Maximum call stack size exceeded")
eval (A) is fast or medium-fast on all browsers
JSON.parse (D, E) are fastest on Safari and Firefox
Details
I performed 4 test cases:
for small shallow object HERE
for small deep object HERE
for big shallow object HERE
for big deep object HERE
The objects used in the above tests came from HERE:
let obj_ShallowSmall = {
field0: false,
field1: true,
field2: 1,
field3: 0,
field4: null,
field5: [],
field6: {},
field7: "text7",
field8: "text8",
}
let obj_DeepSmall = {
level0: {
level1: {
level2: {
level3: {
level4: {
level5: {
level6: {
level7: {
level8: {
level9: [[[[[[[[[['abc']]]]]]]]]],
}}}}}}}}},
};
let obj_ShallowBig = Array(1000).fill(0).reduce((a,c,i) => (a['field'+i]=getField(i),a) ,{});
let obj_DeepBig = genDeepObject(1000);
// ------------------
// Show objects
// ------------------
console.log('obj_ShallowSmall:',JSON.stringify(obj_ShallowSmall));
console.log('obj_DeepSmall:',JSON.stringify(obj_DeepSmall));
console.log('obj_ShallowBig:',JSON.stringify(obj_ShallowBig));
console.log('obj_DeepBig:',JSON.stringify(obj_DeepBig));
// ------------------
// HELPERS
// ------------------
function getField(k) {
let i=k%10;
if(i==0) return false;
if(i==1) return true;
if(i==2) return k;
if(i==3) return 0;
if(i==4) return null;
if(i==5) return [];
if(i==6) return {};
if(i>=7) return "text"+k;
}
function genDeepObject(N) {
// generate: {level0:{level1:{...levelN: {end:[[[...N-times...['abc']...]]] }}}...}}}
let obj={};
let o=obj;
let arr = [];
let a=arr;
for(let i=0; i<N; i++) {
o['level'+i]={};
o=o['level'+i];
let aa=[];
a.push(aa);
a=aa;
}
a[0]='abc';
o['end']=arr;
return obj;
}
The snippet below presents the chosen solutions:
// src: https://stackoverflow.com/q/45015/860099
function A(json) {
return eval("(" + json + ')');
}
// https://stackoverflow.com/a/26377600/860099
function B(json) {
return (new Function('return ('+json+')'))()
}
// improved https://stackoverflow.com/a/26377600/860099
function C(json) {
return Function('return ('+json+')')()
}
// src: https://stackoverflow.com/a/5686237/860099
function D(json) {
return JSON.parse(json);
}
// src: https://stackoverflow.com/a/233630/860099
function E(json) {
return $.parseJSON(json)
}
// --------------------
// TEST
// --------------------
let json = '{"a":"abc","b":"123","d":[1,2,3],"e":{"a":1,"b":2,"c":3}}';
[A,B,C,D,E].map(f=> {
console.log(
f.name + ' ' + JSON.stringify(f(json))
)})
<script src="https://cdnjs.cloudflare.com/ajax/libs/jquery/3.3.1/jquery.min.js"></script>
This snippet only presents the functions used in the performance tests - it does not perform the tests itself!
And here are example results for Chrome:
Converting the object to JSON, and then parsing it, works for me, like:
JSON.parse(JSON.stringify(object))
The recommended approach to parse JSON in JavaScript is to use JSON.parse()
Background
The JSON API was introduced with ECMAScript 5 and has since been implemented in >99% of browsers by market share.
jQuery once had a $.parseJSON() function, but it was deprecated with jQuery 3.0. In any case, for a long time, it was nothing more than a wrapper around JSON.parse().
Example
const json = '{ "city": "Boston", "population": 500000 }';
const object = JSON.parse(json);
console.log(object.city, object.population);
Browser Compatibility
Is JSON.parse supported by all major browsers?
Pretty much, yes (see reference).
Older question, I know; however, nobody has noticed this solution of using new Function(), an anonymous function that returns the data.
Just an example:
var oData = 'test1:"This is my object",test2:"This is my object"';
if( typeof oData !== 'object' )
try {
oData = (new Function('return {'+oData+'};'))();
}
catch(e) { oData=false; }
if( typeof oData !== 'object' )
{ alert( 'Error in code' ); }
else {
alert( oData.test1 );
alert( oData.test2 );
}
This is a little safer because it executes inside a function and does not compile into your code directly. So if there is a function declaration inside it, it will not be bound to the default window object.
I use this to 'compile' configuration settings of DOM elements (for example, the data attribute) simply and quickly.
Summary:
JavaScript (both the browser and Node.js) has a built-in JSON object. On this object are 2 convenient methods for dealing with JSON. They are the following:
JSON.parse() takes JSON as its argument and returns a JS object
JSON.stringify() takes a JS object as its argument and returns a JSON string
Other applications:
Besides dealing with JSON very conveniently, these methods can be used for other purposes. The combination of both JSON methods allows us to make deep clones of arrays or objects very easily. For example:
let arr1 = [1, 2, [3 ,4]];
let newArr = arr1.slice();
arr1[2][0] = 'changed';
console.log(newArr); // not a deep clone
let arr2 = [1, 2, [3 ,4]];
let newArrDeepclone = JSON.parse(JSON.stringify(arr2));
arr2[2][0] = 'changed';
console.log(newArrDeepclone); // A deep clone, values unchanged
You can also use a reviver function to filter:
var data = JSON.parse(jsonString, function reviver(key, value) {
//your code here to filter
});
For more information read JSON.parse.
Just to cover parsing for different input types:
Parse the data with JSON.parse(), and the data becomes a JavaScript object.
var obj = JSON.parse('{ "name":"John", "age":30, "city":"New York"}');
When using JSON.parse() on JSON derived from an array, the method will return a JavaScript array instead of a JavaScript object.
var myArr = JSON.parse(this.responseText);
console.log(myArr[0]);
Date objects are not allowed in JSON.
For dates, do something like this:
var text = '{ "name":"John", "birth":"1986-12-14", "city":"New York"}';
var obj = JSON.parse(text);
obj.birth = new Date(obj.birth);
Functions are not allowed in JSON.
If you need to include a function, write it as a string.
var text = '{ "name":"John", "age":"function () {return 30;}", "city":"New York"}';
var obj = JSON.parse(text);
obj.age = eval("(" + obj.age + ")");
Another option
const json = '{ "fruit": "pineapple", "fingers": 10 }'
let j0s,j1s,j2s,j3s
console.log(`{ "${j0s="fruit"}": "${j1s="pineapple"}", "${j2s="fingers"}": ${j3s="10"} }`)
Try this. This one is written in TypeScript.
export function safeJsonParse(str: string) {
try {
return JSON.parse(str);
} catch (e) {
return str;
}
}
The API doc is here: http://kafka-python.readthedocs.org/en/latest/apidoc/kafka.consumer.html
But when I run the following code, the exception is %d format: a number is required, not NoneType
from kafka import KafkaClient, SimpleConsumer  # import path may differ between kafka-python versions

client = KafkaClient("localhost:9092")
consumer = SimpleConsumer(client, "test-group", "test")
consumer.seek(0, whence=None)  # (0,2) and (0,0)
run = True
while run:
    try:
        message = consumer.get_message(block=False, timeout=4000)
    except Exception as e:
        print "Exception while trying to read msg:", str(e)
When I used the following piece of code, the exception is seek() got an unexpected keyword argument 'partition'
consumer.seek(0, whence=None, partition=None)# (0,2) and (0,0)
Any idea? Thanks.
In Kafka: The Definitive Guide, there is sample code for seek() written in Java (not in Python, but I hope you get the general idea).
public class SaveOffsetsOnRebalance implements ConsumerRebalanceListener {
    public void onPartitionsRevoked(Collection<TopicPartition> partitions) {
        commitDBTransaction();
    }
    public void onPartitionsAssigned(Collection<TopicPartition> partitions) {
        for (TopicPartition partition : partitions)
            consumer.seek(partition, getOffsetFromDB(partition));
    }
}
} // these brackets are exactly the same as the book. I didn't change anything. You might want to though.

consumer.subscribe(topics, new SaveOffsetOnRebalance(consumer));
consumer.poll(0);

for (TopicPartition partition : consumer.assignment())
    consumer.seek(partition, getOffsetFromDB(partition));

while (true) {
    ConsumerRecords<String, String> records = consumer.poll(100);
    for (ConsumerRecord<String, String> record : records)
    {
        processRecord(record);
        storeRecordInDB(record);
        storeOffsetInDB(record.topic(), record.partition(), record.offset());
    }
    commitDBTransaction();
}
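If you want to do the equivalent from Python, newer versions of kafka-python expose seek() on KafkaConsumer for manually assigned partitions. Here is a minimal sketch, assuming a broker on localhost:9092 and the topic/group names from the question:
from kafka import KafkaConsumer, TopicPartition

# Assumes a broker on localhost:9092 and the topic/group from the question.
consumer = KafkaConsumer(
    bootstrap_servers="localhost:9092",
    group_id="test-group",
    enable_auto_commit=False,
)

partition = TopicPartition("test", 0)
consumer.assign([partition])   # manual assignment, so seek() is allowed
consumer.seek(partition, 0)    # jump to offset 0; seek_to_beginning/seek_to_end also exist

for message in consumer:
    print(message.partition, message.offset, message.value)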