The JSON API specification for building web services has been gaining popularity lately. In my opinion, it is a very good thing: it finally brings at least some standardization to API development, so instead of reinventing the wheel yet again, we can use existing libraries on both the server and the client to exchange data, and focus on interesting problems instead of writing serializers and parsers first.
I really like JSON API because it delivers data in a normalized form while preserving the relationships between objects, and it supports pagination, sorting and filtering out of the box. For comparison, here is a simple blog post in traditional JSON, followed by the same data in the JSON API format.
{ "id": "123", "author": { "id": "1", "name": "Paul" }, "title": "My awesome blog post", "comments": [ { "id": "324", "text": "Great job, Bro!", "commenter": { "id": "2", "name": "Nicole" } } ] }
{ "data": [{ "type": "post", "id": "123", "attributes": { "id": 123, "title": "My awesome blog post" }, "relationships": { "author": { "type": "user", "id": "1" }, "comments": { "type": "comment", "id": "324" } } }], "included": [{ "type": "user", "id": "1", "attributes": { "id": 1, "name": "Paul" } }, { "type": "user", "id": "2", "attributes": { "id": 2, "name": "Nicole" } }, { "type": "comment", "id": "324", "attributes": { "id": 324, "text": "Great job, Bro!" }, "relationships": { "commenter": { "type": "user", "id": "2" } } }] }
The main disadvantage of JSON API compared to a traditional response format is its verbosity, but is it really that bad?
Format | Before compression (bytes) | After compression (bytes)
---|---|---
Traditional JSON | 264 | 170
JSON API | 771 | 293
After gzip compression, the difference in size becomes much smaller, and since we are dealing with small amounts of structured data, performance will be fine as well.
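If you want to reproduce such a measurement yourself, a quick standalone Node.js script would look roughly like this; the `post.json` file name here is just a placeholder for either of the two documents above:

```js
// Quick sketch for checking payload sizes; 'post.json' is a placeholder file name.
const fs = require('fs');
const zlib = require('zlib');

const payload = fs.readFileSync('./post.json');

console.log('raw bytes:    ', payload.length);
console.log('gzipped bytes:', zlib.gzipSync(payload).length);
```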
If you wish, you can even construct a synthetic test in which the JSON API representation is smaller than traditional JSON: take a set of objects that all reference the same object, for example blog posts by a single author. In JSON API the author object appears only once, while in traditional JSON it is embedded in every post.
Now for the advantages: the data structure returned by JSON API is always flat and normalized, that is, each object has at most one level of nesting. Such a representation not only avoids duplication of objects but also matches the best practices for managing data in redux. Finally, object typing is built into JSON API, so on the client side there is no need to define "schemas" the way normalizr requires. This makes working with data on the client simpler, as we will soon see.
Note: here and below, redux can be replaced by many other state management libraries, but according to the latest State of JavaScript 2016 survey, redux is far more popular than any other existing solution, so for me redux and state management in JS are practically synonymous.
Out of the box, JSON API is already a good fit for redux; however, there are a few things that can be done better.
In particular, the separation of data into `data` and `included` may make sense for an application, because sometimes it is necessary to distinguish the data we asked for from the data we received "into the bargain". However, data in the store should be kept uniformly, otherwise we risk having multiple copies of the same objects in different places, which contradicts redux best practices.

Also, JSON API returns collections of objects as arrays, while in redux it is much more convenient to work with them as maps.
To solve these problems, I developed the json-api-normalizer library, which does the following:

1. merges `data` and `included`, normalizing the data;
2. converts collections into maps of the form `id => object`;
3. stores the structure of the original JSON API document for each request in a special `meta` object;
4. simplifies the way relationships are represented.

Let us dwell a little more on points 3 and 4.
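But first, to make points 1 and 2 concrete, here is roughly what json-api-normalizer produces for a minimal document; the output shown in the comments is abridged for illustration, so treat it as a sketch rather than a reference:

```js
import normalize from 'json-api-normalizer';

const json = {
  data: [{
    type: 'post',
    id: '123',
    attributes: { title: 'My awesome blog post' },
  }],
  included: [{
    type: 'user',
    id: '1',
    attributes: { name: 'Paul' },
  }],
};

// Types become top-level keys, and each collection becomes an id => object map.
console.log(normalize(json));
// {
//   post: {
//     '123': { id: '123', type: 'post', attributes: { title: 'My awesome blog post' } },
//   },
//   user: {
//     '1': { id: '1', type: 'user', attributes: { name: 'Paul' } },
//   },
// }
```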
Redux, as a rule, accumulates data in the store incrementally, which improves performance and simplifies the implementation of an offline mode. However, when we work with the same data objects across screens, it is not always obvious which data should be taken from the store for a particular screen. For each request, json-api-normalizer stores the structure of the original JSON API document in a special `meta` object, which lets us retrieve from the store exactly the data we need.
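As an illustration, the relevant slice of the store might look roughly like this after a request; the `/posts` endpoint and the exact shape are assumptions made for the sake of the example:

```js
// Hypothetical store slice after requesting '/posts' with the endpoint option;
// the real structure may contain additional fields.
const exampleState = {
  meta: {
    '/posts': {
      data: [
        { id: '123', type: 'post' }, // enough to look the full object up in the store
      ],
    },
  },
  // ...the normalized objects, e.g. post: { '123': { ... } }, live alongside meta
};
```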
json-api-normalizer also converts a relationships description like this:
{ "relationships": { "comments": [{ "type": "comment", "id": "1", }, { "type": "comment", "id": "2", }, { "type": "comment", "id": "3", }] } }
into the following form:
{ "relationships": { "comments": { "type": "comment", "id": "1,2,3" } } }
Such a representation is more convenient when updating the redux state via merge, because we avoid the tricky problem of removing an object from a collection together with all references to it: during the merge, one "id" string simply replaces another, and the problem is solved in a single step. This solution is probably not optimal for every scenario, so I will be happy to accept pull requests that make the current behaviour configurable via options.
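Here is a minimal sketch of why the string form is easier to merge, using lodash merge, which the reducer further below relies on as well:

```js
import merge from 'lodash/merge';

// Arrays are merged index by index, so removing the third comment
// from the collection does not remove it from the merged state:
merge({}, { comments: ['1', '2', '3'] }, { comments: ['1', '2'] });
// => { comments: ['1', '2', '3'] }

// With the comma-separated id string, the new value simply replaces the old one:
merge(
  {},
  { comments: { type: 'comment', id: '1,2,3' } },
  { comments: { type: 'comment', id: '1,2' } },
);
// => { comments: { type: 'comment', id: '1,2' } }
```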
As a source of JSON API documents, I wrote a simple web application using the Phoenix Framework. I will not dwell on its implementation, but I recommend looking at the source code to see how easy it is to build such web services.
For the client, I wrote a small React application.
We will use this skeleton as our starting point. Clone the initial branch:
```bash
git clone https://github.com/yury-dymov/json-api-react-redux-example.git --branch initial
```
You will get a project skeleton (React, redux with DevTools, webpack and the rest of the tooling) that is already configured and works out of the box.
To run the example, type in the console:

```bash
npm run webpack-dev-server
```

and open http://localhost:8050 in the browser.
First, we write a redux middleware that will talk to the API. This is the logical place to use json-api-normalizer, so that we do not repeat the same data normalization code in many redux actions.
```js
import fetch from 'isomorphic-fetch';
import normalize from 'json-api-normalizer';

const API_ROOT = 'https://phoenix-json-api-example.herokuapp.com/api';

export const API_DATA_REQUEST = 'API_DATA_REQUEST';
export const API_DATA_SUCCESS = 'API_DATA_SUCCESS';
export const API_DATA_FAILURE = 'API_DATA_FAILURE';

// Performs the request and normalizes the JSON API response.
function callApi(endpoint, options = {}) {
  const fullUrl = (endpoint.indexOf(API_ROOT) === -1) ? API_ROOT + endpoint : endpoint;

  return fetch(fullUrl, options)
    .then(response => response.json()
      .then((json) => {
        if (!response.ok) {
          return Promise.reject(json);
        }

        return Object.assign({}, normalize(json, { endpoint }));
      }),
    );
}

export const CALL_API = Symbol('Call API');

// Middleware: intercepts actions marked with the CALL_API symbol,
// dispatches REQUEST/SUCCESS/FAILURE actions and passes everything else through.
export default function (store) {
  return function nxt(next) {
    return function call(action) {
      const callAPI = action[CALL_API];

      if (typeof callAPI === 'undefined') {
        return next(action);
      }

      let { endpoint } = callAPI;
      const { options } = callAPI;

      if (typeof endpoint === 'function') {
        endpoint = endpoint(store.getState());
      }

      if (typeof endpoint !== 'string') {
        throw new Error('Specify a string endpoint URL.');
      }

      const actionWith = (data) => {
        const finalAction = Object.assign({}, action, data);
        delete finalAction[CALL_API];
        return finalAction;
      };

      next(actionWith({ type: API_DATA_REQUEST, endpoint }));

      return callApi(endpoint, options || {})
        .then(
          response => next(actionWith({ response, type: API_DATA_SUCCESS, endpoint })),
          error => next(actionWith({ type: API_DATA_FAILURE, error: error.message || 'Something bad happened' })),
        );
    };
  };
}
```
This is where all the "magic" happens: once the middleware receives a response, it runs the data through json-api-normalizer and passes the result further down the chain.
Note: with a slightly more thorough error handler, this code would be perfectly suitable for production.
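For example, a more careful version of `callApi()` could extract the JSON API `errors` array and the HTTP status. This is only a sketch of one possible approach under my own assumptions; the `ApiError` class is not part of the example project:

```js
// Hypothetical error type carrying the HTTP status and the JSON API "errors" array.
class ApiError extends Error {
  constructor(status, errors = []) {
    super(errors.map(e => e.detail || e.title).join('; ') || `HTTP ${status}`);
    this.status = status;
    this.errors = errors;
  }
}

// Same API_ROOT and normalize as in the middleware above.
function callApi(endpoint, options = {}) {
  const fullUrl = (endpoint.indexOf(API_ROOT) === -1) ? API_ROOT + endpoint : endpoint;

  return fetch(fullUrl, options)
    .then((response) => {
      if (!response.ok) {
        // Try to read the JSON API "errors" array; fall back to the status code alone.
        return response.json()
          .catch(() => ({}))
          .then(json => Promise.reject(new ApiError(response.status, json.errors)));
      }

      return response.json().then(json => normalize(json, { endpoint }));
    });
}
```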
Add middleware to the store configuration:
```diff
 ...
+import api from './middleware/api';

 export default function (initialState = {}) {
   const store = createStore(rootReducer, initialState, compose(
-    applyMiddleware(thunk),
+    applyMiddleware(thunk, api),
     DevTools.instrument(),
 ...
```
Now create the first action:
```js
import { CALL_API } from '../middleware/api';

export function test() {
  return {
    [CALL_API]: {
      endpoint: '/test',
    },
  };
}
```
Write a reducer:
```js
import merge from 'lodash/merge';
import { API_DATA_REQUEST, API_DATA_SUCCESS } from '../middleware/api';

const initialState = {
  meta: {},
};

export default function (state = initialState, action) {
  switch (action.type) {
    case API_DATA_SUCCESS:
      return merge(
        {},
        state,
        merge({}, action.response, { meta: { [action.endpoint]: { loading: false } } }),
      );
    case API_DATA_REQUEST:
      return merge({}, state, { meta: { [action.endpoint]: { loading: true } } });
    default:
      return state;
  }
}
```
Add our reducer to the redux store configuration:
```js
import { combineReducers } from 'redux';
import data from './data';

export default combineReducers({
  data,
});
```
The model layer is ready! Now let's connect the business logic to the UI.
```jsx
import React, { PropTypes } from 'react';
import { connect } from 'react-redux';
import Button from 'react-bootstrap-button-loader';
import { test } from '../../redux/actions/test';

const propTypes = {
  dispatch: PropTypes.func.isRequired,
  loading: PropTypes.bool,
};

function Content({ loading = false, dispatch }) {
  function fetchData() {
    dispatch(test());
  }

  return (
    <div>
      <Button loading={loading} onClick={() => { fetchData(); }}>Fetch Data from API</Button>
    </div>
  );
}

Content.propTypes = propTypes;

function mapStateToProps() {
  return {};
}

export default connect(mapStateToProps)(Content);
```
Open the page in the browser and click the button. Thanks to the browser DevTools and Redux DevTools, you can see that our application receives data in the JSON API format, converts it into a more convenient representation, and stores it in the redux store. Great! Now it is time to display this data in the UI.
The redux-object library turns data from the redux store back into plain JavaScript objects. You pass it the reducer's slice of the state, the object type, and the id, and it does the rest.
```js
import build, { fetchFromMeta } from 'redux-object';

console.log(build(state.data, 'post', '1')); // ---> post
console.log(fetchFromMeta(state.data, '/posts')); // ---> array of posts
```
All relationships are exposed as JavaScript properties with lazy-loading support, that is, a child object is built only when it is actually accessed.
```js
const post = build(state.data, 'post', '1'); // ---> post object; `author` and `comments` properties are not loaded

post.author; // ---> user object
```
Add some new UI components to display data on the page.
Note: I deliberately omit work with styles so as not to distract attention from the main topic of the article.
First we need to pull the data from the store and pass it to the components via the connect function:
```jsx
import React, { PropTypes } from 'react';
import { connect } from 'react-redux';
import Button from 'react-bootstrap-button-loader';
import build from 'redux-object';
import { test } from '../../redux/actions/test';
import Question from '../Question';

const propTypes = {
  dispatch: PropTypes.func.isRequired,
  questions: PropTypes.array.isRequired,
  loading: PropTypes.bool,
};

function Content({ loading = false, dispatch, questions }) {
  function fetchData() {
    dispatch(test());
  }

  const qWidgets = questions.map(q => <Question key={q.id} question={q} />);

  return (
    <div>
      <Button loading={loading} onClick={() => { fetchData(); }}>Fetch Data from API</Button>
      {qWidgets}
    </div>
  );
}

Content.propTypes = propTypes;

function mapStateToProps(state) {
  if (state.data.meta['/test']) {
    const questions = (state.data.meta['/test'].data || []).map(object => build(state.data, 'question', object.id));
    const loading = state.data.meta['/test'].loading;

    return { questions, loading };
  }

  return { questions: [] };
}

export default connect(mapStateToProps)(Content);
```
Here we take the metadata of the '/test' request, pull out the object IDs, and build 'question' objects from them, which we pass to the component as the questions collection.
{ "name": "Question", "version": "0.0.0", "private": true, "main": "./Question" }
```jsx
import React, { PropTypes } from 'react';
import Post from '../Post';

const propTypes = {
  question: PropTypes.object.isRequired,
};

function Question({ question }) {
  const postWidgets = question.posts.map(post => <Post key={post.id} post={post} />);

  return (
    <div className="question">
      {question.text}
      {postWidgets}
    </div>
  );
}

Question.propTypes = propTypes;

export default Question;
```
We display questions and answers to them.
{ "name": "Post", "version": "0.0.0", "private": true, "main": "./Post" }
```jsx
import React, { PropTypes } from 'react';
import Comment from '../Comment';
import User from '../User';

const propTypes = {
  post: PropTypes.object.isRequired,
};

function Post({ post }) {
  const commentWidgets = post.comments.map(c => <Comment key={c.id} comment={c} />);

  return (
    <div className="post">
      <User user={post.author} />
      {post.text}
      {commentWidgets}
    </div>
  );
}

Post.propTypes = propTypes;

export default Post;
```
Here we display the author of the answer, the answer text, and its comments.
{ "name": "User", "version": "0.0.0", "private": true, "main": "./User" }
```jsx
import React, { PropTypes } from 'react';

const propTypes = {
  user: PropTypes.object.isRequired,
};

function User({ user }) {
  return <span className="user">{user.name}: </span>;
}

User.propTypes = propTypes;

export default User;
```
{ "name": "Comment", "version": "0.0.0", "private": true, "main": "./Comment" }
```jsx
import React, { PropTypes } from 'react';
import User from '../User';

const propTypes = {
  comment: PropTypes.object.isRequired,
};

function Comment({ comment }) {
  return (
    <div className="comment">
      <User user={comment.author} />
      {comment.text}
    </div>
  );
}

Comment.propTypes = propTypes;

export default Comment;
```
That's all! If something does not work, you can compare your code with the master branch of my project.
Live demo is available here.
The json-api-normalizer and redux-object libraries appeared only recently. From the outside they may look quite simple, but before arriving at this implementation I spent about a year running into all sorts of non-obvious pitfalls, so I am confident that these small and convenient tools can be useful to the community and save a lot of time.
I invite you to take part in the discussion, as well as help me in the development of these tools.
Source: https://habr.com/ru/post/318958/