@lighthouse/sdk
SDK for communicating with the Lighthouse API, built with Redux.
import React, { Component } from 'react'
import { connect } from 'react-redux'
import { getModule } from '@lighthouse/sdk'

const listId = 'all' // optional, defaults to 'default' in all module methods
const location = getModule('location')

@connect(
  mapStateToProps,
  mapDispatchToProps,
)
export default class MyComponent extends Component {
  // build component...
}

// React Redux
function mapStateToProps(state) {
  const listOptions = { /* sort options etc. */ }
  const locationSelectors = location.selectors(state)

  return {
    // returns an array of location resources
    locations: locationSelectors(listId).list(listOptions),
    // returns the current location resource
    currentLocation: locationSelectors.current(),
  }
}

function mapDispatchToProps(dispatch) {
  const { query, save, findById, remove } = location

  return {
    fetch: params => dispatch(query(listId, params)),
    save: (params, payload, id) => dispatch(save(params, payload, id)),
    findById: id => dispatch(findById(id)),
    remove: id => dispatch(remove(id)),
  }
}
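If your build isn't set up for decorator syntax, the same wiring works with a plain connect call (this is standard react-redux usage, reusing the same mapStateToProps and mapDispatchToProps from above):

class MyComponent extends Component {
  // build component...
}

// equivalent to the @connect decorator above
export default connect(
  mapStateToProps,
  mapDispatchToProps,
)(MyComponent)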
You can use the optimistic option to specify that new entities, or updates to entities, should be stored in the cache as soon as the request is sent. In some situations this improves the UX, giving a perceived performance increase and instant feedback.

It is particularly useful for a chat interface, where waiting for new messages to be persisted to the API hampers the experience (loading spinners). More often than not you can be sure a send-message request will succeed, so you can treat the message as sent as soon as it is created and, if the request does fail, retry it.

To make an optimistic update for save requests, pass the optimistic option in the params, e.g.:
// assuming we have set up the message module...
const params = {
  optimistic: true
}

const payload = {
  body: 'Hi Friend!'
}

// message will be available in cache as soon as request is made
message.save(params, payload)
You can follow an offline-first workflow using the optimistic option as above. Here's how the offline-first flow works:

1. When the request is sent, the new or updated entity is written to the cache straight away, initially with a temporary id value.
2. If the request succeeds, the cached entity is updated with the _id value returned in the response.
3. If the request fails, the entity is marked as rolled-back so we can retry it if we want. We can identify these items in the cache via their state value of rolled-back and (if it was a new document) the fact it doesn't have an _id value in its entity data.

There are some things to keep in mind when working in an offline manner.
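For example, after a failed request you may want to find the rolled-back items and retry them. The sketch below is illustrative only: the store wiring and the exact shape of cached items (a state field and an entity field on each item) are assumptions rather than confirmed SDK API.

import { getModule } from '@lighthouse/sdk'

const message = getModule('message')
const listId = 'all'

// `store` is assumed to be your Redux store
function retryRolledBackMessages(store) {
  const state = store.getState()
  const cachedMessages = message.selectors(state)(listId).list()

  cachedMessages
    // items that failed to persist carry a state of 'rolled-back'
    .filter(item => item.state === 'rolled-back')
    // re-dispatch an optimistic save with the original entity data
    .forEach(item => {
      store.dispatch(message.save({ optimistic: true }, item.entity))
    })
}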
Most of our resources follow a RESTful CRUD pattern, so it's easy to add new modules to the sdk. To add a new CRUD module:

1. Create the new module in /modules. The simplest way is to clone an existing CRUD module, e.g. 'zone', and update the references in the index file and the test.
2. Export the new module in /module/index.js along with the other modules.
3. Add the module to /module/test/index.test.js, which ensures the correct modules are correctly exported.
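As a purely hypothetical sketch (the real file layout may differ, so clone an existing module such as 'zone' to see the actual structure), exporting a new module from the index might look something like this, where 'widget' is a made-up module name:

// /module/index.js (hypothetical sketch)
import widget from './widget'
import zone from './zone'
// ...other modules

export {
  widget,
  zone,
}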
This repo is designed to publish automatically to npm when a PR is merged into master. For this process to run smoothly you will need to do two things: set a GitHub label on the project of either minor or patch, and then merge the PR. The package.json scripts should not be run manually, as this can generate artifacts that prevent the automatic publishing process from running smoothly.

Use minor if there are no breaking changes and you're adding a semver feature, and patch if there are no breaking changes and you're adding a semver bug fix.
You can replace a node_module dependency downloaded from npm with a local copy of that module. This can be done using yarn link to create a symlink between the two repos that allows you to use the local copy instead. With this technique you can create a link from the web repo to this repo while you're developing, so you can work locally without having to publish multiple canary builds of this repo to npm. Once you've set up your link you can run a watcher in both the web and sdk projects so any changes are built automatically and you can debug live.
# run in this (sdk) repo to register the link
yarn link

# run in the web repo to use the local copy of the sdk
yarn link @lighthouse/sdk
You should now be able to make changes in your local sdk project and they will be available in your web project immediately.
If you open two terminals and run yarn start from the web repo and yarn dev from the sdk repo, a change to the sdk will trigger a build on save, and when that build finishes it will automatically trigger a fresh build and deploy of web. You can use this to live debug in the browser locally.
Once you're done you can remove the symlink with the yarn unlink command; it works in a similar fashion to creating a link.
# run in this (sdk) repo to remove the link registration
yarn unlink

# run in the web repo to stop using the local copy
yarn unlink @lighthouse/sdk
You will now be back to using the version of the sdk that you downloaded from npm.