Is it possible to send a POST CORS request with json data?

Is it possible to send a custom POST CORS request with json data?

I found that the website has a CORS misconfiguration: it is accepting my Origin header. However, the request is a POST one, and if I try without any POST data I get:

{"errorCode":"invalid","message":"Invalid json body","statusCode":400}

I was wondering if it’s possible to send CORS requests containing JSON data. If so, how should I edit my proof-of-concept code?

At the moment I’m using the following:

<script>
var createCORSRequest = function(method, url) {
  var xhr = new XMLHttpRequest();
  if ("withCredentials" in xhr) {
    // Most browsers.
    xhr.open(method, url, true);
  } else if (typeof XDomainRequest != "undefined") {
    // IE8 & IE9
    xhr = new XDomainRequest();
    xhr.open(method, url);
  } else {
    // CORS not supported.
    xhr = null;
  }
  return xhr;
};

var url = '';
var method = 'POST';
var xhr = createCORSRequest(method, url);

xhr.onload = function() {
  // Success code goes here.
};

xhr.onerror = function() {
  // Error code goes here.
};

xhr.withCredentials = true;
xhr.send();
</script>

But I’ll need to add {"id":"test","name":"test"} as the POST JSON data to my PoC to make it work. How could I do that?
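In principle you just pass the JSON string to xhr.send(). One caveat: setting Content-Type to application/json turns the POST into a preflighted request, so CORS PoCs often keep a "simple" content type such as text/plain and rely on the server parsing the body as JSON anyway (whether this server does is an assumption). A minimal sketch:

```javascript
// Build the JSON body; the "id"/"name" fields come from the question.
var payload = JSON.stringify({ id: "test", name: "test" });

// In the PoC, instead of a bare xhr.send():
//   xhr.setRequestHeader("Content-Type", "text/plain"); // stays a "simple request", no preflight
//   xhr.withCredentials = true;
//   xhr.send(payload);
```

If the server insists on Content-Type: application/json, the request will be preflighted, and the exploit then also depends on the server answering the OPTIONS preflight permissively.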

Null byte injection using JSON

I’m trying to make a chatroom for my university. It takes the username in JSON, stores it in an array, then writes it to the DB for keeping logs. The thing is, that array also has a "status" key, whose value is set to guest by default, but is set to ADMIN if I or any member of my team logs in. I know that the idea of storing "status" with the username is bad, but I just started working on the project. I want to confirm: is it possible to inject a NULL byte via the username field in JSON and add another key with the same name "status" to gain admin privileges?
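A null byte is unlikely to survive JSON parsing intact, but duplicate keys are the classic way to smuggle in a second "status": most parsers, including JavaScript's JSON.parse, silently keep the last occurrence of a repeated key. A sketch of that behavior (field names follow the question; whether it is exploitable depends on which parser your server uses and where it merges the default):

```javascript
// Attacker-controlled request body containing a duplicate "status" key.
var body = '{"username": "eve", "status": "guest", "status": "ADMIN"}';

// JSON.parse keeps the LAST duplicate key, overriding the earlier value.
var user = JSON.parse(body);
console.log(user.status); // "ADMIN"
```

The robust fix is the one you already suspect: never accept "status" from client input at all; set it server-side after parsing.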

Benefits of storing columns in JSON instead of traditional tables?

Are there any benefits in using JSON over traditional table structures?

Imagine having a table structure like this:

create table table1 (
    t_id int,
    first_name varchar(20),
    last_name varchar(20),
    age int
)

What if you stored the same columns inside a JSON document like this:

{
    "first_name": "name",
    "last_name": "name",
    "age": 2
}

and have a table like this:

create table table2 (
    t_id int,
    attribute jsonb
)

Correct me if I’m wrong, but since both variants cause the row to be completely rewritten whenever there is an update or delete on that row, the two variants are identical in that regard.

How do I produce key pairs for my users while implementing JSON Web Tokens

I want to check the integrity of my users’ information: that the information my website server receives was indeed sent by them. From what I understand, JSON Web Tokens (JWT) are the way to go.

I want to use asymmetric keys for the signing. I know that key pairs can be generated using these commands:

openssl genrsa -out private.pem 2048
openssl rsa -in private.pem -outform PEM -pubout -out public.pem

But how do I produce key pairs from JavaScript code for my users when they sign in? And how do I store the private key at the user end and the public key at the server end?

AES encryption (in Java) of different JSON strings always produces the same encrypted string as result. Why?

I have a program written in Java which takes a JSON string as argument, encrypts it using AES, then encodes it using Base64. The JSON string is like:

{"a": "b"} or {"a": "n"} or {"a": "k"}  

I.e. the related object has one property a; the value part is randomly generated.

The program’s outputs for the above JSON inputs look like:

UBNvKoRoGqk0PTQQL5K4Sw==
bKwlToSND3HkceDExEDXSw==
u/yKJq1FdoifBM+AnadC3A==

i.e. they are unique.

Same goes for {"a":"gn"}, i.e. a random string of length 2. Same for 3 and so on.

But starting from length 7, the program produces the same encoded string for different inputs. I mean, the following JSON strings taken as input:

{"a": "pzfovvs"}
{"a": "bqwuvck"}

produce the same string as output:


Same goes for lengths 8 and 9. Starting from 10, results become unique again.

What is the explanation of this strange phenomenon?

(I can post code if needed.)

Ok, here is the code:

import java.security.Key;
import java.security.NoSuchAlgorithmException;
import java.util.Base64;
import javax.crypto.Cipher;
import javax.crypto.KeyGenerator;

public class JWTEncryptor {

    private static String algorithm = "AES";
    private static Key key;
    private static KeyGenerator keyGenerator;
    private static Cipher cipher;

    public static String encrypt(String jwt) throws Exception {
        if (key == null || cipher == null) {
            setUp();
        }
        cipher.init(Cipher.ENCRYPT_MODE, key);
        return Base64.getEncoder().encodeToString(cipher.doFinal(jwt.getBytes("UTF-8")));
    }

    private static void setUp() {
        try {
            cipher = Cipher.getInstance(algorithm);
        } catch (Exception e1) {
            e1.printStackTrace();
        }
        if (keyGenerator != null) {
            key = keyGenerator.generateKey();
            return;
        }
        try {
            keyGenerator = KeyGenerator.getInstance(algorithm);
            key = keyGenerator.generateKey();
        } catch (NoSuchAlgorithmException e) {
            e.printStackTrace();
        }
    }

    public static String decrypt(String encryptedJWT) throws Exception {
        cipher.init(Cipher.DECRYPT_MODE, key);
        return new String(cipher.doFinal(Base64.getDecoder().decode(encryptedJWT)));
    }
}

How to store and retrieve custom metabox data as JSON in the WordPress API

I am trying to store metabox data as JSON in WordPress and then retrieve it using the wp-json API.

What I tried:

$current = array(
    'id'   => 'in json object',
    'name' => 'name in json object'
);
// Pass the array directly; update_post_meta() serializes arrays automatically,
// so calling serialize() yourself stores a doubly-serialized string.
update_post_meta( $post_id, 'ha_basic_information', $current );

What I am trying to achieve: {"id": "in json object", "name": "name in json object"}

What I am getting: "s:75:\"a:2:{s:2:\"id\";s:14:\"in json object\";s:4:\"name\";s:19:\"name in json object\";}\";"

I want to get the data as an object in the wp-json API. I’ve tried serialize/unserialize and json_encode/json_decode, but nothing seems to work. I am new to PHP and can’t figure it out. There are many related questions, but nothing seems to work.

Why do modern sites frequently use JSON blobs client-side and construct the webpage in JavaScript client-side? [closed]

I have noticed something in recent years. Instead of actually creating an HTML page, such as a table of data, on the server and sending the finished HTML page, sites nowadays send a “minimal” (in a very broad sense of that word…) webpage which executes JavaScript, which in turn loads JSON blobs and parses them client-side to construct the webpage that is finally displayed to the user.

One side-effect of this, intentional or not, is that it often makes it much easier for me to “grab” their data since they frequently just “dump” their internal database’s fields to the client, even if they themselves don’t use all fields. So in a way, it’s like they are making it easier for people like me to automate things on their websites, whereas I used to have to constantly parse complex, messy, ever-changing HTML code.

So while I hate how idiotic this is from a logical/user perspective, as well as from a security one (depending on various factors), it’s actually “good” for me in a way. I just don’t understand it, since it’s significantly more work on their part and an overall extremely strange way of doing things.

Whenever I notice that an HTML page is “empty of content”, I always open the network tab in Pale Moon and reload the page, and then I see a “JSON” blob which I can study. It’s bizarre. It’s almost as if they are unofficially providing an “API” without mentioning it openly, but secretly wink/nudge to us “powerusers”?

Compress JSON String Stored in PostgreSQL, such as MessagePack?

JSON strings are currently being stored in a PostgreSQL 11 table in a text field. For example, a row can have the text field asks containing the string:


Question: Is it possible to store it in a format that consumes less space? Spending some CPU cycles to serialize/deserialize the JSON string is an acceptable compromise for using less storage space. The JSON data does not need to be searchable. The JSON object keys are almost always different in different rows.

I am particularly interested in using JSON encoding/compression algorithms like MessagePack with zlib, but how can we use this when inserting the record into the PostgreSQL table?

Note: we are also using Node.js with Knex.js to communicate with PostgreSQL. Currently we convert JSON objects to strings using Node’s JSON.stringify function.

JSON hijacking does not work

I have patched my site to prevent JSON hijacking. During this process, I was interested to see if I could actually exploit this vulnerability.

So I created a foo.html and added a script tag whose src attribute referenced my site, which I was logged into. I was unable to exploit the vulnerability. I took a look at the network traffic, and I could not see my authentication cookie being passed in the request.

Does this mean that most browsers have fixed the vulnerability? Is there some table that will let me know which browsers have fixed it? Or have I completely misunderstood the vulnerability?

Evolving conversions between trees (representing JSON) and graphs (representing relational database schemas)

As input I have different, say 100, types of trees with labeled nodes (representing JSON files). I need to transform the information contained in the trees into graphs with labeled nodes (representing insert statements into tables in a relational database).

The structure of the trees evolves very quickly. Every month some labels are moved, renamed, or their type changes.

Let’s say that I know how to transform trees into graphs at time t; is it possible to somehow infer at least part of the new transformations for the new tree structure?

Are there papers or books that I need to read to know some theory that could help me to tackle this task?