A Ruby wrapper for the Readmill Developer API. The API allows you to get all of your Books, Readings, Highlights, Comments, and more, and this wrapper lets you interact with it from Ruby.
Lookout-Rake Lookout-Rake provides Rake¹ tasks for testing using Lookout. ¹ See http://rake.rubyforge.org/ § Installation Install Lookout-Rake with % gem install lookout-rake § Usage Include the following code in your ‹Rakefile›: require 'lookout-rake-3.0' Lookout::Rake::Tasks::Test.new If the ‹:default› task hasn’t been defined it’ll be set to depend on the ‹:test› task. The ‹:check› task will also depend on the ‹:test› task. There’s also a ‹:test:coverage› task that gets defined that uses the coverage library that comes with Ruby 1.9 to check the test coverage when the tests are run. You can hook up your test task to use your Inventory¹: load File.expand_path('../lib/library-X.0/version.rb', __FILE__) Lookout::Rake::Tasks::Test.new :inventory => Library::Version Also, if you use the tasks that come with Inventory-Rake², the test task will hook into the inventory you tell them to use automatically, that is, the following will do: load File.expand_path('../lib/library-X.0/version.rb', __FILE__) Inventory::Rake::Tasks.define Library::Version Lookout::Rake::Tasks::Test.new For further usage information, see the {API documentation}³. ¹ Inventory: http://disu.se/software/inventory/ ² Inventory-Rake: http://disu.se/software/inventory-rake/ ³ API: http://disu.se/software/lookout-rake/api/Lookout/Rake/Tasks/Test/ § Integration To use Lookout together with Vim¹, place ‹contrib/rakelookout.vim› in ‹~/.vim/compiler› and add compiler rakelookout to ‹~/.vim/after/ftplugin/ruby.vim›. Executing ‹:make› from inside Vim will now run your tests and any errors and failures can be visited with ‹:cnext›. Execute ‹:help quickfix› for additional information. Another useful addition to your ‹~/.vim/after/ftplugin/ruby.vim› file may be nnoremap <buffer> <silent> <Leader>M <Esc>:call <SID>run_test()<CR> let b:undo_ftplugin .= ' | nunmap <buffer> <Leader>M' function! s:run_test() let test = expand('%') let line = 'LINE=' . line('.') if test =~ '^lib/' let test = substitute(test, '^lib/', 'test/', '') let line = "" endif execute 'make' 'TEST=' . shellescape(test) line endfunction Now, pressing ‹<Leader>M› will either run all tests for a given class, if the implementation file is active, or run the test at or just before the cursor, if the test file is active. This is useful if you’re currently receiving a lot of errors and/or failures and want to focus on those associated with a specific class or on a specific test. ¹ Find out more about Vim at http://www.vim.org/ § Financing Currently, most of my time is spent at my day job and in my rather busy private life. Please motivate me to spend time on this piece of software by donating some of your money to this project. Yeah, I realize that requesting money to develop software is a bit, well, capitalistic of me. But please realize that I live in a capitalistic society and I need money to have other people give me the things that I need to continue living under the rules of said society. So, if you feel that this piece of software has helped you out enough to warrant a reward, please PayPal a donation to now@disu.se¹. Thanks! Your support won’t go unnoticed! ¹ Send a donation: https://www.paypal.com/cgi-bin/webscr?cmd=_donations&business=now%40disu%2ese&item_name=Nikolai%20Weibull%20Software%20Services § Reporting Bugs Please report any bugs that you encounter to the {issue tracker}¹. ¹ See https://github.com/now/lookout-rake/issues § Authors Nikolai Weibull wrote the code, the tests, the manual pages, and this README.
This gem is useful in development where a mock REST API server is required, and for Frank UI test automation.
A lightweight Rails plugin that is very useful for developers.
A convenient API for developers to interact with SQS with a trivial amount of effort
The Notifiee gem allows Ruby developers to programmatically send notifications to team members (via multiple channels like Email, SMS, Telegram, Slack, Messenger, and Twitter DMs) through the Notifiee web service. The API is implemented as JSON over HTTP.
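Since the Notifiee service speaks JSON over HTTP, a raw request could look roughly like the sketch below; the endpoint URL, payload fields, and auth header are placeholders assumed for illustration, not the gem's or the service's documented interface.

```ruby
require 'net/http'
require 'json'
require 'uri'

# Placeholder endpoint, fields, and auth scheme -- consult the Notifiee docs for the real ones.
uri = URI('https://notifiee.example.com/api/notifications')
request = Net::HTTP::Post.new(uri, 'Content-Type' => 'application/json',
                                   'Authorization' => "Bearer #{ENV['NOTIFIEE_TOKEN']}")
request.body = { channel: 'slack', recipient: '@oncall', message: 'Deploy finished' }.to_json

response = Net::HTTP.start(uri.host, uri.port, use_ssl: true) { |http| http.request(request) }
puts response.code
```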
README ====== This is a simple API to evaluate information retrieval results. It allows you to load ranked and unranked query results and calculate various evaluation metrics (precision, recall, MAP, kappa) against a previously loaded gold standard. Start this program from the command line with: retreval -l <gold-standard-file> -q <query-results> -f <format> -o <output-prefix> The options are outlined when you pass no arguments and just call retreval You will find further information in the RDOC documentation and the HOWTO section below. If you want to see an example, use this command: retreval -l example/gold_standard.yml -q example/query_results.yml -f yaml -v INSTALLATION ============ If you have RubyGems, just run gem install retreval You can manually download the sources and build the Gem from there by `cd`ing to the folder where this README is saved and calling gem build retreval.gemspec This will create a gem file which you just have to install with `gem install <file>` and you're done. HOWTO ===== This API supports the following evaluation tasks: - Loading a Gold Standard that takes a set of documents, queries and corresponding judgements of relevancy (i.e. "Is this document relevant for this query?") - Calculation of the _kappa measure_ for the given gold standard - Loading ranked or unranked query results for a certain query - Calculation of _precision_ and _recall_ for each result - Calculation of the _F-measure_ for weighing precision and recall - Calculation of _mean average precision_ for multiple query results - Calculation of the _11-point precision_ and _average precision_ for ranked query results - Printing of summary tables and results Typically, you will want to use this Gem either standalone or within another application's context. Standalone Usage ================ Call parameters --------------- After installing the Gem (see INSTALLATION), you can always call `retreval` from the commandline. The typical call is: retreval -l <gold-standard-file> -q <query-results> -f <format> -o <output-prefix> Where you have to define the following options: - `gold-standard-file` is a file in a specified format that includes all the judgements - `query-results` is a file in a specified format that includes all the query results in a single file - `format` is the format that the files will use (either "yaml" or "plain") - `output-prefix` is the prefix of output files that will be created Formats ------- Right now, we focus on the formats you can use to load data into the API. Currently, we support YAML files that must adhere to a special syntax. So, in order to load a gold standard, we need a file in the following format: * "query" denotes the query * "documents" these are the documents judged for this query * "id" the ID of the document (e.g. its filename, etc.) * "judgements" an array of judgements, each one with: * "relevant" a boolean value of the judgment (relevant or not) * "user" an optional identifier of the user Example file, with one query, two documents, and one judgement: - query: 12th air force germany 1957 documents: - id: g5701s.ict21311 judgements: [] - id: g5701s.ict21313 judgements: - relevant: false user: 2 So, when calling the program, specify the format as `yaml`. For the query results, a similar format is used. Note that it is necessary to specify whether the result sets are ranked or not, as this will heavily influence the calculations. You can specify the score for a document. 
By "score" we mean the score that your retrieval algorithm has given the document. But this is not necessary. The documents will always be ranked in the order of their appearance, regardless of their score. Thus in the following example, the document with "07" at the end is the first and "25" is the last, regardless of the score. --- query: 12th air force germany 1957 ranked: true documents: - score: 0.44034874 document: g5701s.ict21307 - score: 0.44034874 document: g5701s.ict21309 - score: 0.44034874 document: g5701s.ict21311 - score: 0.44034874 document: g5701s.ict21313 - score: 0.44034874 document: g5701s.ict21315 - score: 0.44034874 document: g5701s.ict21317 - score: 0.44034874 document: g5701s.ict21319 - score: 0.44034874 document: g5701s.ict21321 - score: 0.44034874 document: g5701s.ict21323 - score: 0.44034874 document: g5701s.ict21325 --- query: 1612 ranked: true documents: - score: 1.0174774 document: g3290.np000144 - score: 0.763108 document: g3201b.ct000726 - score: 0.763108 document: g3400.ct000886 - score: 0.6359234 document: g3201s.ct000130 --- **Note**: You can also use the `plain` format, which will load the gold standard in a different way (but not the results): my_query my_document_1 false my_query my_document_2 true See that every query/document/relevancy pair is separated by a tabulator? You can also add the user's ID in the fourth column if necessary. Running the evaluation ----------------------- After you have specified the input files and the format, you can run the program. If needed, the `-v` switch will turn on verbose messages, such as information on how many judgements, documents and users there are, but this shouldn't be necessary. The program will first load the gold standard and then calculate the statistics for each result set. The output files are automatically created and contain a YAML representation of the results. Calculations may take a while depending on the amount of judgements and documents. If there are a thousand judgements, always consider a few seconds for each result set. Interpreting the output files ------------------------------ Two output files will be created: - `output_avg_precision.yml` - `output_statistics.yml` The first lists the average precision for each query in the query result file. The second file lists all supported statistics for each query in the query results file. For example, for a ranked evaluation, the first two entries of such a query result statistic look like this: --- 12th air force germany 1957: - :precision: 0.0 :recall: 0.0 :false_negatives: 1 :false_positives: 1 :true_negatives: 2516 :true_positives: 0 :document: g5701s.ict21313 :relevant: false - :precision: 0.0 :recall: 0.0 :false_negatives: 1 :false_positives: 2 :true_negatives: 2515 :true_positives: 0 :document: g5701s.ict21317 :relevant: false You can see the precision and recall for that specific point and also the number of documents for the contingency table (true/false positives/negatives). Also, the document identifier is given. API Usage ========= Using this API in another ruby application is probably the more common use case. All you have to do is include the Gem in your Ruby or Ruby on Rails application. For details about available methods, please refer to the API documentation generated by RDoc. **Important**: For this implementation, we use the document ID, the query and the user ID as the primary keys for matching objects. This means that your documents and queries are identified by a string and thus the strings should be sanitized first. 
Loading the Gold Standard ------------------------- Once you have loaded the Gem, you will probably start by creating a new gold standard. gold_standard = GoldStandard.new Then, you can load judgements into this standard, either from a file, or manually: gold_standard.load_from_yaml_file "my-file.yml" gold_standard.add_judgement :document => doc_id, :query => query_string, :relevant => boolean, :user => John There is a nice shortcut for the `add_judgement` method. Both lines are essentially the same: gold_standard.add_judgement :document => doc_id, :query => query_string, :relevant => boolean, :user => John gold_standard << :document => doc_id, :query => query_string, :relevant => boolean, :user => John Note the usage of typical Rails hashes for better readability (also, this Gem was developed to be used in a Rails webapp). Now that you have loaded the gold standard, you can do things like: gold_standard.contains_judgement? :document => "a document", :query => "the query" gold_standard.relevant? :document => "a document", :query => "the query" Loading the Query Results ------------------------- Now we want to create a new `QueryResultSet`. A query result set can contain more than one result, which is what we normally want. It is important that you specify the gold standard it belongs to. query_result_set = QueryResultSet.new :gold_standard => gold_standard Just like the Gold Standard, you can read a query result set from a file: query_result_set.load_from_yaml_file "my-results-file.yml" Alternatively, you can load the query results one by one. To do this, you have to create the results (either ranked or unranked) and then add documents: my_result = RankedQueryResult.new :query => "the query" my_result.add_document :document => "test_document 1", :score => 13 my_result.add_document :document => "test_document 2", :score => 11 my_result.add_document :document => "test_document 3", :score => 3 This result would be ranked, obviously, and contain three documents. Documents can have a score, but this is optional. You can also create an Array of documents first and add them altogether: documents = Array.new documents << ResultDocument.new :id => "test_document 1", :score => 20 documents << ResultDocument.new :id => "test_document 2", :score => 21 my_result = RankedQueryResult.new :query => "the query", :documents => documents The same applies to `UnrankedQueryResult`s, obviously. The order of ranked documents is the same as the order in which they were added to the result. The `QueryResultSet` will now contain all the results. They are stored in an array called `query_results`, which you can access. So, to iterate over each result, you might want to use the following code: query_result_set.query_results.each_with_index do |result, index| # ... end Or, more simply: for result in query_result_set.query_results # ... end Calculating statistics ---------------------- Now to the interesting part: Calculating statistics. As mentioned before, there is a conceptual difference between ranked and unranked results. Unranked results are much easier to calculate and thus take less CPU time. No matter if unranked or ranked, you can get the most important statistics by just calling the `statistics` method. 
statistics = my_result.statistics In the simple case of an unranked result, you will receive a hash with the following information: * `precision` - the precision of the results * `recall` - the recall of the results * `false_negatives` - number of not retrieved but relevant items * `false_positives` - number of retrieved but nonrelevant items * `true_negatives` - number of not retrieved and nonrelevant items * `true_positives` - number of retrieved and relevant items In case of a ranked result, you will receive an Array that consists of _n_ such Hashes, depending on the number of documents. Each Hash will give you the information at a certain rank, e.g. the following two lines return the recall at the fourth rank. statistics = my_ranked_result.statistics statistics[3][:recall] In addition to the information mentioned above, you can also get for each rank: * `document` - the ID of the document that was returned at this rank * `relevant` - whether the document was relevant or not Calculating statistics with missing judgements ---------------------------------------------- Sometimes, you don't have judgements for all document/query pairs in the gold standard. If this happens, the results will be cleaned up first. This means that every document in the results that doesn't appear to have a judgement will be removed temporarily. As an example, take the following results: * A * B * C * D Our gold standard only contains judgements for A and C. The results will be cleaned up first, thus leading to: * A * C With this approach, we can still provide meaningful results (for precision and recall). Other statistics ---------------- There are several other statistics that can be calculated, for example the **F measure**. The F measure weighs precision and recall and has one parameter, either "alpha" or "beta". Get the F measure like so: my_result.f_measure :beta => 1 If you don't specify either alpha or beta, we will assume that beta = 1. Another interesting measure is **Cohen's Kappa**, which tells us about the inter-agreement of assessors. Get the kappa statistic like this: gold_standard.kappa This will calculate the average kappa for each pairwise combination of users in the gold standard. For ranked results one might also want to calculate an **11-point precision**. Just call the following: my_ranked_result.eleven_point_precision This will return a Hash that has indices at the 11 recall levels from 0 to 1 (with steps of 0.1) and the corresponding precision at that recall level.
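To make that return shape concrete, here is a small sketch of consuming the Hash returned by `eleven_point_precision`; that the keys are the numeric recall levels from 0.0 through 1.0 is an assumption inferred from the description above.

```ruby
# Iterate the recall-level => precision pairs in ascending recall order.
precisions = my_ranked_result.eleven_point_precision
precisions.sort.each do |recall_level, precision|
  puts format("recall %.1f -> precision %.3f", recall_level, precision)
end
```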
This is a very simple API wrapper for the Formhub.org platform. It uses the platform's JSON endpoints. Please refer to Formhub's developer help page if you want to know more about their API.
This is my first gem and the idea was to develop a super simple and lightweight client to connect with the Coingecko API. Keep in mind that this is by no means an official gem from Coingecko; it is an MIT-licensed pet project! Use at your own discretion! I published it originally with the methods that I needed for another project, but I'll keep on adding more methods to interact with the rest of the API. For any comments, suggestions, or pull requests, feel free to contact me.
A Ruby library that provides a flexible engine for developing RESTful API clients.
Simplifies interaction with the SellerCloud SOAP API to enable faster development of integrations.
A Ruby interface for the Echo Nest Developer API
This gem is a wrapper on top of the Google API that allows a developer to handle calendars, events, and contacts on behalf of the user.
A simple Ruby library that lets any developer automate the training process of a Natural Language Processing engine on API.AI, and retrieve meaning from new utterances.
API requirements pack for Rails API development
This library enables developers to execute Paypal APIs on behalf of merchants.
Provides an extensible Sinatra API to support rapid WeChat development
This is an in-house tool developed by Palringo to help interface with the GetLocalization API
Create a new instance of YtDataApi::YtDataApiClient passing user credentials (username, password) and a YouTube developer key to access the YouTube Data API using ClientLogin authentication.
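A hedged instantiation sketch based on that description; the require name and the constructor's argument order are assumptions, and only the class name `YtDataApi::YtDataApiClient` comes from the text above.

```ruby
require 'yt_data_api'  # assumed require name

# Assumed argument order: username, password, developer key (per the description above).
client = YtDataApi::YtDataApiClient.new('my_username', 'my_password', 'MY_YOUTUBE_DEVELOPER_KEY')
```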
PredictionIO is an open source machine learning server for developers and data scientists to create predictive engines for production environments. This gem provides convenient access to the PredictionIO API for Ruby programmers so that you can focus on application logic.
Disable developers and APIs with a key/val lookup
It has a full HackEx API library, and LibGoo. LibGoo is developed to ease your scripting, so you can easily write and manage your scripts!
Asianodds Disclaimer: This gem is not officially developed by Asianodds and does not belong in any way to the Asianodds service, nor is it supported by their development team, and all rights to accept or deny bets made with this gem remain with Asianodds. This gem is a wrapper for the Asianodds Web API. In order to use this gem you need to apply for a Web API account with Asianodds (api@asianodds88.com). Please keep in mind that your regular Asianodds user (for the Web Interface) does not work for your API account and vice versa. The purpose of the gem is to preconfigure all API calls to make your life as easy as just calling one method. You won't need to MD5 hash your password (as Asianodds requires) and the gem assumes smart preconfigs for your calls, so they will work even without passing in required parameters. Still, it has the same flexibility as the original API without limitations. With just three lines of code you will be able to start placing real-time bets with multiple bookmakers and automate your trading strategies. For more information on the usage of the gem, please visit the GitHub page
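As a purely hypothetical illustration of the promised three-line start (the class, method, and parameter names below are invented for this sketch and are not the gem's documented API; see the GitHub page for the real one):

```ruby
# Hypothetical names throughout -- illustration only.
client = Asianodds::Client.new(username: 'api_user', password: 'plain_password') # the gem hashes the password for you
client.login
client.place_bet(game_id: 123_456, market: 'FullTimeHdp', amount: 10)
```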
# Chef Data Region ## Description Chef Data Region extends the `Chef::DSL::DataQuery` module's `data_bag_item` method with the ability to dynamically expand the data bag name in a configurable, environment-specific manner. ## Motivation This gem exists to address the following scenario: An organization maintains data in Chef data bag items. The data is deployed to several data center environments and is stored in data bags whose names reference the environments. The organization wants to write environment-agnostic recipes that access the data bags without explicitly referencing the data bags by their environment names. As a concrete example, imagine the organization maintains encrypted data for three deployment environments: development, staging, and production. It maintains this data in three data bags, one for each environment, with data for services named `gadget` and `widget` in items: | Environment | Bag | Item | |-------------+----------------+--------| | Development | secure-dev | gadget | | Development | secure-dev | widget | | Production | secure-prod | gadget | | Production | secure-prod | widget | | Staging | secure-staging | gadget | | Staging | secure-staging | widget | The items are encrypted with a key unique to that environment to maximize security. Now consider how a recipe would access these bags. When the recipe is running, it needs to know the data center environment in order to construct the bag name. The organization would most likely assign the environment name to a node attribute. In a naive implementation, each recipe would include logic that examined the attribute's value to determine which bag to load. This would obviously duplicate code. Imagine instead that the organization wants to reference the bag by the name `secure` and rely on an _abstraction_ to translate `secure` into the environment-specific bag name. This gem provides that abstraction. ## Features This gem overrides the `data_bag_item` method with configurable logic that dynamically decides which bag to load. It retains API compatibility with `Chef::DSL::DataQuery#data_bag_item`, so existing recipes that call `data_bag_item` work without modification. The gem imposes no constraints on data bag item structure. ## Configuration Assign the region name to a node attribute that varies by environment: node.default['local']['region'] = 'staging' Add the following configuration to Chef Client's `client.rb` file. * Require the gem: require 'chef/data_region' * Configure the gem with a hash that maps a bag name to an expansion pattern: Chef::DataRegion.add( 'secure', { attribute: %w(local region), pattern: 'secure-%<attribute>s' } ) ## Bag name expansion The gem expands bag names using Ruby's `format` method. _More pending..._
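To make the expansion concrete, here is a minimal sketch of the substitution described above using plain Ruby `format` with a named reference; it mirrors the configured pattern and attribute but is not the gem's internal code, and the final `data_bag_item` line is an assumed illustration of the recipe-side call.

```ruby
# Expansion of the logical bag name 'secure' into an environment-specific name.
region  = 'staging'                          # value of node['local']['region']
pattern = 'secure-%<attribute>s'             # pattern passed to Chef::DataRegion.add
bag     = format(pattern, attribute: region) # => "secure-staging"
puts bag

# In a recipe, the call stays environment-agnostic (illustrative):
# item = data_bag_item('secure', 'gadget')   # resolved to the 'secure-staging' bag
```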
Twitterpunch =============== Twitterpunch is designed to work with PhotoBooth and OS X Folder Actions. When this script is called with the name of an image file, it will post the image to Twitter, along with a message randomly chosen from a list and a specified hashtag. If you call the script with the `--stream` argument instead, it will listen for tweets to that hashtag and download them to a specified directory. If the tweet came from another user, Twitterpunch will speak it aloud. Typically, you'll run one copy on an OSX laptop with PhotoBooth, and a separate copy on another machine (either Windows or OSX) for the viewer. You can also use a mobile device as a remote control, if you like. This will allow the user to enter a custom message for each photo that gets tweeted out, if they'd like. Configuration =========== Configure the program via the `~/.twitterpunch/config.yaml` YAML file. This file should look similar to the example below. --- :twitter: # twitter configuration :consumer_key: <consumer key> :consumer_secret: <consumer secret> :access_token: <access token> :access_token_secret: <access secret> :messages: # list of messages to attach - Hello there # to outgoing tweets - I'm a posting fool - minimally viable product :hashtag: Twitterpunch # The hashtag to post and listen to :handle: Twitterpunch # The twitter username to post as :photodir: ~/Pictures/twitterpunch/ # Where to save downloaded images :logfile: ~/.twitterpunch/activity.log # Where to save logs :viewer: # Use the built-in slideshow viewer :count: 5 # How many images to have onscreen at once :remote: :timeout: 45 # How long the button should remain disabled for :apptitle: dslrBooth # The photo booth application title :hotkey: space # Which hotkey to send to trigger a photo 1. Generate a skeleton configuration file * `twitterpunch --configure` 1. Edit the configuration file as needed. You'll be prompted with the path. * If you have your own Twitter application credentials, you're welcome to use them. 1. Authorize the application with the Twitter API. * `twitterpunch --authorize` Usage ========== ### Using OS X PhotoBooth 1. Start PhotoBooth at least once to generate its library. 1. Install the Twitterpunch Folder Action * `twitterpunch --install` * It may claim that it could not be attached, fear not. 1. Profit! * _and by that, I mean take some shots with PhotoBooth!_ *Note*: if the folder action doesn't seem to work and photos aren't posted to Twitter, here are some troubleshooting steps to take: 1. Run Twitterpunch by hand with photos as arguments. This may help you isolate configuration or authorization issues. * `twitterpunch foo.jpg` 1. Correct the path in the workflow. * `which twitterpunch` * Edit the Twitterpunch folder action to include that path. #### Using the remote web app Configure the remote web app using the `:remote` hash in `config.yaml`. You can usually find the title of the app using `system_profiler -detailLevel full SPApplicationsDataType` and grepping for the name or path to the `.app`. In this example, the title is _dslrBooth_. [ben@ganymede] ~ $ system_profiler -detailLevel full SPApplicationsDataType | grep -B8 dslrBooth.app dslrBooth: Version: 2.9 Obtained from: Identified Developer Last Modified: 10/14/17, 9:50 PM Kind: Intel 64-Bit (Intel): Yes Signed by: Developer ID Application: Hope Pictures LLC (MZR5GHAQX4), Developer ID Certification Authority, Apple Root CA Location: /Applications/dslrBooth.app 1. Run the app with `twitterpunch --remote` 1. 
Browse to the app with http://{address}:8080 1. [optional] If on an iOS device, add to your homescreen * This will give you "app behaviour", such as full screen, and a nice icon #### Troubleshooting. 1. Make sure the folder action is installed properly 1. Use the Finder to navigate to `~/Pictures/` 1. Right click on the `Photo Booth Library` icon and choose _Show Package Contents_. 1. Right click on the `Pictures` folder and choose `Services > Folder Actions Setup` 1. Make sure that the `Twitterpunch` action is attached. 1. Install the folder action 1. Open the `resources` folder of this gem. * Likely to be found in `/Library/Ruby/Gems/{version}/gems/twitterpunch-#{version}/resources/`. 1. Double click on the `Twitterpunch` folder action and install it. * It may claim that it could not be attached, fear not. ### Using something besides PhotoBooth Configure the program you are using for your photo shoot to call Twitterpunch each time it snaps a photo. Pass the name of the new photo as a command line argument. Alternatively, you could batch them, as Twitterpunch can accept multiple files at once. [ben@ganymede] ~ $ twitterpunch photo.jpg [photo2.jpg photo3.jpg photo4.jpg] You can manually install the Folder Action, or you can follow the automated install process after tweaking the workflow slightly. 1. Identify where the app stores the resulting image files. 1. Edit the Twitterpunch folder action to include that path. 1. Follow the steps above to install the Folder Action. ### Viewing the Twitter stream Twitterpunch will run on OS X or Windows equally well. Simply configure it on the computer that will act as the Twitter display and then run in streaming mode. [ben@ganymede] ~ $ twitterpunch --stream There are two modes that Twitterpunch can operate in. 1. If a `:hashtag` is defined then all images tweeted to the configured hashtag will be displayed in the slideshow. 1. Otherwise, Twitterpunch will stream the `:handle` Twitter user's stream and display all images either posted by that user or addressed to that user. With protected tweets, you can have rudimentary access control. In either mode, tweets that come from any other user will also be spoken aloud. If you don't want to use the built-in slideshow viewer, you can disable it by removing the `:viewer` key from your `~/.twitterpunch/config.yaml` config file. Twitterpunch will then simply download the tweeted images and save them into the `:photodir` directory. You can then use anything you like to view them. There are currently two decent viewing options I am aware of. * Windows background image: * Configure the Windows background to randomly cycle through photos in a directory. * Hide desktop icons. * Hide the taskbar. * Disable screensaver and power savings. * Drawbacks: You're using Windows and you have to install Ruby & RubyGems manually. * OS X screensaver: * Choose one of the sexy screensavers and configure it to show photos from the `:photodir` * Set screensaver to a super short timeout. * Disable power savings. * Drawbacks: The screensaver doesn't reload dynamically, so I have to kick it and you'll see it reloading each time a new tweet comes in. Limitations =========== * It currently requires manual setup for Folder Actions. * Rubygame is kind of a pain to set up. Contact ======= * Author: Ben Ford * Email: binford2k@gmail.com * Twitter: @binford2k * IRC (Freenode): binford2k
This gem gives a developer a set of tools for securing access to their endpoints.
Gem to consume APIs available at Leroy Merlin Brazil Developer's Portal https://developers.leroymerlin.com.br
Extends String with Datamuse functions. Datamuse https://www.datamuse.com/api/ is an open API described as a word-finding query engine for developers.
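For context, the public Datamuse endpoint that such a gem wraps can be queried directly; the sketch below calls the HTTP API itself rather than the gem's own String methods, whose names are not listed here.

```ruby
require 'net/http'
require 'json'
require 'uri'

# Words that rhyme with "forgetful", straight from the Datamuse API.
uri = URI('https://api.datamuse.com/words')
uri.query = URI.encode_www_form(rel_rhy: 'forgetful', max: 5)
words = JSON.parse(Net::HTTP.get(uri)).map { |entry| entry['word'] }
puts words.inspect
```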
This gem uses Net::HTTP to communicate with the PayPal servers. It has calls for the most common PayPal functions to simplify using PayPal in a Ruby application. I developed this package as a way to call PayPal's REST API with greater transparency than the paypal-sdk-rest gem has, making development easier. This package can be easily extended by simply adding additional methods (as files) to the lib/paypkg folder. Contributions welcome.
Client library for Kakao Developers Open Platform API
This gem is supposed to supply a simple and intuitive interface to IBGE's SIDRA API, standardizing the output objects as a standard class. It also makes interacting with the API seamless, transforming the inputs for the web service into intuitive function entries. Eia stands for Easy IBGE Access and was developed in partnership with Tendencias - Consultoria Econômica.
ROS Ruby Client: rosruby ======= [ROS](http://ros.org) is a Robot Operating System developed by [Willow Garage](http://www.willowgarage.com/) and open source communities. This project provides a Ruby ROS client. You can program robots in Ruby very easily. **Homepage**: http://otl.github.com/rosruby **Git**: http://github.com/OTL/rosruby **Author**: Takashi Ogura **Copyright**: 2012 **License**: new BSD License **Latest Version**: 0.2.0 Requirements ---------- - ruby (1.8.x/1.9.x) - ROS (electric/fuerte) - ROS requires Python 2.7 or later Let's start --------------- Install ROS and Ruby first. The ROS installation document is [http://ros.org/wiki/ROS/Installation](http://ros.org/wiki/ROS/Installation). You can install Ruby with apt. ```bash $ sudo apt-get install ruby ``` Download rosruby into your ROS_PACKAGE_PATH. ```bash $ git clone git://github.com/OTL/rosruby.git ``` Please add the RUBYLIB environment variable, like below (if you are using bash). ```bash $ echo "export RUBYLIB=`rospack find rosruby`/lib" >> ~/.bashrc $ source ~/.bashrc ``` To use with the precompiled electric release ----------------------- If you are using a precompiled ROS distro, use the msg/srv generation script (rosruby_genmsg.py). If you are using ROS from source, you just need to recompile the msg/srv packages with rosmake rosruby. ```bash $ rosrun rosruby rosruby_genmsg.py ``` This converts msg/srv definitions to .rb files, which are needed by the sample programs. If you want to generate code for other packages, add the package names as args. For example, ```bash $ rosrun rosruby rosruby_genmsg.py geometry_msgs nav_msgs ``` Sample Source -------------- ## Subscriber ```ruby #!/usr/bin/env ruby require 'ros' require 'std_msgs/String' node = ROS::Node.new('/rosruby/sample_subscriber') node.subscribe('/chatter', Std_msgs::String) do |msg| puts "message come! = \'#{msg.data}\'" end while node.ok? node.spin_once sleep(1) end ``` ## Publisher ```ruby #!/usr/bin/env ruby require 'ros' require 'std_msgs/String' node = ROS::Node.new('/rosruby/sample_publisher') publisher = node.advertise('/chatter', Std_msgs::String) msg = Std_msgs::String.new i = 0 while node.ok? msg.data = "Hello, rosruby!: #{i}" publisher.publish(msg) sleep(1.0) i += 1 end ``` Note ---------------- Ruby requires class and module names to start with a capital letter. So please use the **S**td_msgs::String class instead of **s**td_msgs::String. Try Publish and Subscribe ---------------------- You need three terminals, as is often the case for ROS users. Then run roscore if it is not running. ```bash $ roscore ``` Run the publisher sample ```bash $ rosrun rosruby sample_publisher.rb ``` Run the subscriber sample ```bash $ rosrun rosruby sample_subscriber.rb ``` You can check the publication by using rostopic. ```bash $ rostopic list $ rostopic echo /chatter ``` Try Service? ---------------------- ```bash $ rosrun rosruby add_two_ints_server.rb ``` Run the client with args ('a' and 'b' for roscpp_tutorials/TwoInts) ```bash $ rosrun rosruby add_two_ints_client.rb 10 20 ``` and more... ---------------------- You need more tools for testing and generating documentation. ```bash $ sudo apt-get install rake gem $ sudo gem install yard redcarpet simplecov ``` do all tests ------------------------- Run roscore if it is not running. ```bash $ roscore ``` and run the unit tests. ```bash $ roscd rosruby $ rake test ``` documents -------------------------- You can generate API documents using yard. Document generation needs yard and redcarpet. You can install these with the gem command like this. 
```bash $ gem install yard redcarpet ``` Then try to generate the documents. ```bash $ rake yard ``` You can access the generated documents from [here](http://otl.github.com/rosruby/doc/).
Hodor is a Ruby-based framework, API and Command Line Interface that automates and simplifies the way you specify, deploy, debug and administer Hadoop and Oozie solutions. Hadoop lacks a mature toolchain to manage a codebase with modern software development discipline. To address this need, Hodor comprises a combination of tools and conventions that enable, in the Hadoop environment, many of the modern software development practices and deployment facilities we take for granted in normal software development.
[Backported security patched version] A Ruby framework for rapid API development with great conventions.
API wrapper for use in conjunction with an Elvanto account. This wrapper can be used by developers to develop programs for their own churches using an API Key, or to design integrations to share with other churches using OAuth authentication.
SafePoller is a Ruby gem that provides a safe and reliable way to perform periodic polling operations in multi-threaded environments. It offers a simple and intuitive API for running a block of code at a defined interval, while ensuring thread safety and preventing potential race conditions. SafePoller allows developers to focus on implementing their logic without worrying about thread safety issues.
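A hypothetical usage sketch matching that description; the `start`/`stop` method names and the `interval:` option are assumptions for illustration, not SafePoller's documented API.

```ruby
require 'safe_poller'  # assumed require name

# Hypothetical API: run the block every five seconds on a background thread.
poller = SafePoller.start(interval: 5.0) do
  puts "polled at #{Time.now}"
end

sleep 30
poller.stop  # hypothetical: stop polling cleanly
```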
Howcast offers an Application Programming Interface (API) which allows developers to build applications that interface with Howcast. The Howcast API is RESTful (REpresentational State Transfer) and users of this API will be able to: 1) Retrieve detailed information about a single video, including metadata such as title, description, video views, rating, etc.; 2) Retrieve a list of videos restricted by a set of filters offered by Howcast and sorted using several metrics that you can specify (most recent, most views, etc.); 3) Search for videos; 4) And much more. Note: Before you can use our APIs, you must register an API key, which is submitted with each request.
Blockfrost is a hosted API as a service serving the data from the Cardano blockchain. This gem is a Ruby SDK for Blockfrost.io, enabling developers to use the full power of this API without having to create basic functions for it. The OpenAPI documentation is hosted at https://github.com/blockfrost/openapi. This gem is licensed under ASL 2.0.
Raw provides everything you need to develop professional Web applications using Ruby and JavaScript. Raw redefines Rapid Application Development by providing a clean, yet efficient API and a layer of domain specific languages implemented on top of Ruby. Raw is Web 2.0 ready, featuring excellent support for AJAX, XML, and Syndication while staying standards-compliant. Raw is part of the Nitro Mega Framework.
Turnitin Core API (TCA) provides direct API access to the core functionality provided by Turnitin. TCA supports file submission, similarity report generation, group management, and visualization of report matches via Cloud Viewer or PDF download. Below is the full flow to successfully set up an integration scope, an API Key, and make calls to TCA. Integration Scope and API Key management is done via the Admin Console UI by logging in as an admin user. For more details, go to our [developer portal documentation page](https://developers.turnitin.com/docs). ## Integration Scope and API Key Management TCA API calls must provide an API Key for authentication, so you must first have at least one integration scope associated with at least one API Key to use TCA. ### Admin Console UI First, log in to the Admin Console UI as an *Admin* user with permission to create Integration Scopes, under a tenant that is licensed to use the TCA product. Integration Scopes (you can create a new one, or add keys to an existing one) * Click `Integrations` in the side bar --> `+ Add Integration` at the top of the page --> Enter a name --> `Add` Button API Keys * Click `Integrations` in the side bar --> `Create API Key` Button next to a given Integration Scope --> Enter a name --> click the `Create and View` button * Copy/Save the key manually or click the save-to-clipboard button to copy it (this is the only time it will show) ## TCA Flow * Register a webhook * Create a submission * Upload a file for the submission * Wait for the submission upload to process * If you registered a webhook, a callback will be sent to it when upload is complete * The status of the *submission* will also update to `COMPLETE` * Request a Similarity Report * Wait for similarity report to process * If you registered a webhook, a callback will be sent to it when report is complete * The status of the *report* will also be updated to `COMPLETE` * Request a URL with parameters to view the Similarity Report
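A rough Ruby sketch of the start of that flow; the base URL, endpoint path, payload fields, and header names below are placeholders assumed for illustration, and the real values should be taken from the developer portal documentation.

```ruby
require 'net/http'
require 'json'
require 'uri'

BASE_URL = 'https://your-tenant.example.com/api/v1' # placeholder host and path
API_KEY  = ENV.fetch('TCA_API_KEY')                 # key created in the Admin Console UI

def tca_post(path, payload)
  uri = URI("#{BASE_URL}#{path}")
  request = Net::HTTP::Post.new(uri, 'Content-Type' => 'application/json',
                                     'Authorization' => "Bearer #{API_KEY}")
  request.body = payload.to_json
  Net::HTTP.start(uri.host, uri.port, use_ssl: true) { |http| http.request(request) }
end

# 1. Create a submission (placeholder path and fields).
response = tca_post('/submissions', { 'title' => 'Essay 1', 'owner' => 'student-42' })
submission_id = JSON.parse(response.body)['id']
puts submission_id
# 2. Upload the file, wait for COMPLETE, then request and poll the Similarity Report
#    (further placeholder calls omitted).
```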
A Sinatra/MongoDB API server to use for EmberJS development
This gem provides a class for interacting with the DopamineAPI from Ruby. After you have received your API key and configured the actions and reinforcements relevant to your app on the [Dopamine Developer Dashboard](dashboard.usedopamine.com), you may use this gem to place 'tracking' and 'reinforcement' calls from inside your app that will communicate directly with the DopamineAPI.
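A purely hypothetical sketch of the 'tracking' and 'reinforcement' calls described above; the class and method names are invented for illustration, and the real interface is documented on the Dopamine Developer Dashboard.

```ruby
# Hypothetical names throughout -- illustration of the described flow only.
client = Dopamine::Client.new(api_key: ENV['DOPAMINE_API_KEY'])

client.track('opened_app', user_id: 'user-42')                        # a 'tracking' call
decision = client.reinforce('completed_workout', user_id: 'user-42')  # a 'reinforcement' call
puts decision
```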
The Apigee Registry API allows teams to upload and share machine-readable descriptions of APIs that are in use and in development. These descriptions include API specifications in standard formats like OpenAPI, the Google API Discovery Service Format, and the Protocol Buffers Language. These API specifications can be used by tools like linters, browsers, documentation generators, test runners, proxies, and API client and server generators. The Registry API itself can be seen as a machine-readable enterprise API catalog designed to back online directories, portals, and workflow managers.
Payture API for Ruby. To get started with Payture, you will need an account; please contact Payture directly. Please also note that this module was not developed by Payture.
An API key is a simple encrypted string that you can use when calling Google Cloud APIs. The API Keys service manages the API keys associated with developer projects.
Rack provides a minimal, modular and adaptable interface for developing web applications in Ruby. By wrapping HTTP requests and responses in the simplest way possible, it unifies and distills the API for web servers, web frameworks, and software in between (the so-called middleware) into a single method call. Also see http://rack.rubyforge.org.
This plugin publishes artifacts to Dropbox via an API token generated from the Dropbox Developer API.
A Ruby library for the Blip.fm API. Developed to facilitate the development of Blip.fm (a social media service centered around music) applications in Ruby.
Multi-purpose Ruby wrapper for the Hubspot CRM API with access to the raw response or Ruby objects for faster development
SmartLink provides a web portal which connects SmartLine controllers in the field to the cloud allowing basic operations like starting or stopping irrigation for a specific zone on a specified controller. All controllers which belong to a customer account can be managed via the web portal, iOS and Android applications, or this API. The current version is v2 of the API. You must have a developer key in order to use this version.