diff --git a/tutorials/recommendations-with-shaped-ai/conclusion.md b/tutorials/recommendations-with-shaped-ai/conclusion.md new file mode 100644 index 000000000..bc9b79fe3 --- /dev/null +++ b/tutorials/recommendations-with-shaped-ai/conclusion.md @@ -0,0 +1,20 @@
---
position: 8
title: Conclusion
---

# Conclusion

In this tutorial, we have explored the **Shaped.ai** solution accelerator, which feeds Shaped.ai with Snowplow data, enabling customers to update recommendations for their end users in real time.

We have successfully built a real-time system for processing event data, including:
- Initial training data for Shaped.ai, to jump-start a base recommendation system with seed data;
- Continuously feeding the Shaped.ai recommendation engine with new data as it is generated in real time.

This tutorial can be extended to use Snowplow event data for other real-time use cases, such as:
- Web engagement analytics;
- Ad performance tracking.

## Next Steps
- Extend tracking: track more granular user interactions, or add tracking on a new platform such as mobile
- Expand use cases: Shaped.ai can be used not just for recommendations, but also for hyper-personalization based on each customer's history

diff --git a/tutorials/recommendations-with-shaped-ai/impression-integration.md b/tutorials/recommendations-with-shaped-ai/impression-integration.md new file mode 100644 index 000000000..c75c76199 --- /dev/null +++ b/tutorials/recommendations-with-shaped-ai/impression-integration.md @@ -0,0 +1,168 @@
---
title: Impression Integration
position: 7
---

To "close the loop" and allow Shaped.ai to improve the performance of its recommendations, we need to feed back information about how they perform.

We want it to know which of its recommendations actually got clicked so it can account for that in its models and optimize the performance of individual recommendations.

For this to work, we need to track the recommendation IDs and clicks of the widgets we render based on its suggestions.

To do this, we need several adjustments to what we've built so far. We need to change:

- The site tracking to track the clicks (we will also track impressions generally so we can track performance)
- The Snowbridge config to account for the click events

## Tracking Impressions

First up, we update our recommendation widget snippet to track impressions and clicks.

We will use the [Element Tracking plugin](https://github.com/snowplow/snowplow-javascript-tracker/pull/1400) to implement this.
```javascript
snowplow("addPlugin", "/cdn/shop/t/3/assets/element-tracker.umd.min.js", ["snowplowElementTracking", "SnowplowElementTrackingPlugin"]);

// set up impression tracking
snowplow("startElementTracking", {
  elements: {
    name: "recommendation-impression", // name our configuration something logical
    selector: "[data-recommendation-id]", // selector will vary based on the widget implementation
    expose: { when: "element", minPercentage: .5 }, // once per widget, only once it is 50% in view
    component: true, // mark it as a component so we can get clicks
    details: { dataset: ["recommendationId"] }, // extract the recommendation ID
    contents: {
      name: "recommendation-item",
      selector: "[data-item-id]",
      details: { dataset: ["itemId"] } // also extract the shown item IDs
    }
  }
});

// set up click tracking
snowplow("getComponentListGenerator", function (_, componentGeneratorWithDetail) {
  document.addEventListener("click", function(e) {
    if (e.target.closest("a") && e.target.closest("[data-recommendation-id]")) {
      const target = e.target.closest("a");
      const details = componentGeneratorWithDetail(target);
      snowplow("trackLinkClick", { element: target, context: details });
    }
  }, false);
});
```

With this configuration, whenever our custom recommendations widget is in view, we will fire an `expose_element` event like the following:

```json
{
  "schema": "iglu:com.snowplowanalytics.snowplow/expose_element/jsonschema/1-0-0",
  "data": {
    "element_name": "recommendation-impression"
  }
}
```

This event will have an `element` entity describing our widget, including the recommendation/impression ID, like so:

```json
{
  "schema": "iglu:com.snowplowanalytics.snowplow/element/jsonschema/1-0-0",
  "data": {
    "element_name": "recommendation-impression",
    "width": 1600,
    "height": 229.4166717529297,
    "position_x": 160,
    "position_y": 531.5,
    "doc_position_x": 160,
    "doc_position_y": 3329,
    "element_index": 1,
    "element_matches": 1,
    "originating_page_view": "3d775590-74c6-4d0a-85ee-4d63d72bda2d",
    "attributes": [
      {
        "source": "dataset",
        "attribute": "recommendationId",
        "value": "RID-24-4a6a-8380-506b189ff622-CID-529b19"
      }
    ]
  }
}
```

And it will also contain `element_content` entities for each item in the widget, capturing their product IDs, like the following:

```json
{
  "schema": "iglu:com.snowplowanalytics.snowplow/element_content/jsonschema/1-0-0",
  "data": {
    "element_name": "recommendation-item",
    "parent_name": "recommendation-impression",
    "parent_position": 1,
    "position": 1,
    "attributes": [
      {
        "source": "dataset",
        "attribute": "itemId",
        "value": "48538191331628"
      }
    ]
  }
}
```

In addition, if the links in the widget are clicked, we'll generate a regular `link_click` event -- but because our widget is defined as a component, it will extract the same entities as an impression and include those, too.

These `link_click` events are what we need to detect and forward to Shaped.ai.
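To make the next section easier to follow, here is an abridged, illustrative sketch of how such a `link_click` event might look once it has been enriched and converted to JSON. The exact field set will vary; the entity payloads below are simply re-used from the examples above, and the `targetUrl` is a placeholder:

```json
{
  "event_name": "link_click",
  "unstruct_event_com_snowplowanalytics_snowplow_link_click_1": {
    "targetUrl": "https://shop.example.com/products/example-product"
  },
  "contexts_com_snowplowanalytics_snowplow_element_1": [
    {
      "element_name": "recommendation-impression",
      "attributes": [
        { "source": "dataset", "attribute": "recommendationId", "value": "RID-24-4a6a-8380-506b189ff622-CID-529b19" }
      ]
    }
  ],
  "contexts_com_snowplowanalytics_snowplow_element_content_1": [
    {
      "element_name": "recommendation-item",
      "parent_name": "recommendation-impression",
      "attributes": [
        { "source": "dataset", "attribute": "itemId", "value": "48538191331628" }
      ]
    }
  ]
}
```

These `contexts_*` property names are the ones the Snowbridge transform in the next section reads to recover the impression ID and item IDs.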
## Snowbridge Impressions and Clicks

Now that we also need to know about `link_click` events, we need to include them in our filter:

```hcl
  regex = "^(snowplow_ecommerce_action|action|view_item|transaction_item|create_order)$" # before

  regex = "^(snowplow_ecommerce_action|action|view_item|transaction_item|create_order|link_click)$" # after
```

Our custom transform then needs to be aware of them:

```javascript
    case 'link_click': // recommendation clicks
      ep.Data.event_type = "Click";

      const element = event.contexts_com_snowplowanalytics_snowplow_element_1 || [];
      const content = event.contexts_com_snowplowanalytics_snowplow_element_content_1 || [];

      if (!element.length) return SKIP_EVENT; // unrelated link_click
      if (!content.length) return SKIP_EVENT; // unrelated link_click

      let impressionId = null;

      element.forEach((e) => {
        if (e.element_name !== "recommendation-impression") return; // some other element/component
        if (e.attributes) {
          e.attributes.forEach((a) => {
            if (a.source === "dataset" && a.attribute === "recommendationId") {
              impressionId = a.value;
            }
          });
        }
      });

      if (!impressionId) return SKIP_EVENT; // couldn't find impression info

      const items = [];

      content.forEach((ec) => {
        if (ec.parent_name !== "recommendation-impression") return;
        items.push(ec.attributes[0].value);
      });

      ep.Data.item_ids = items; // for simplicity we forward every item ID shown in the widget rather than identifying the exact item clicked
      ep.Data.impression_id = impressionId;
      break;
    default:
      return SKIP_EVENT;
```

Snowbridge will now send our clicked-recommendation events to Shaped.ai, which will then be able to optimize its recommendations based on how they perform.

diff --git a/tutorials/recommendations-with-shaped-ai/introduction.md b/tutorials/recommendations-with-shaped-ai/introduction.md new file mode 100644 index 000000000..c9de52a2e --- /dev/null +++ b/tutorials/recommendations-with-shaped-ai/introduction.md @@ -0,0 +1,25 @@
---
title: Introduction
position: 1
---

[Shaped.ai](https://shaped.ai) is an ML-based solution that provides personalization, recommendation, and search optimization capabilities for end users. It can use Snowplow data [to build different use cases](https://docs.shaped.ai/docs/use_cases/overview/).

Shaped.ai offers a [REST API](https://docs.shaped.ai/docs/api), as well as [SDKs for Python and JavaScript](https://docs.shaped.ai/docs/overview/install-sdk), a [CLI in Python](https://docs.shaped.ai/docs/overview/installing-shaped-cli), and a [UI/UX interface (the dashboard)](https://dashboard.shaped.ai/).

This accelerator demonstrates how Snowplow data can be used to feed Shaped.ai models. Any Snowplow setup that supports Snowbridge can be used, such as [Snowplow Local](https://github.com/snowplow-incubator/snowplow-local). For testing purposes, we recommend generating events using one of our examples that works with our out-of-the-box ecommerce events, like our [**Snowplow ecommerce store**](https://github.com/snowplow-industry-solutions/ecommerce-nextjs-example-store).
## Key technologies

* Snowplow: event tracking pipeline (Collector, Enrich, Kinesis sink)
* [Snowbridge](/docs/api-reference/snowbridge/): event forwarding module, part of Snowplow
* AWS Kinesis: message broker, set up by the Shaped.ai team, that receives events from Snowbridge

### Event capture and ingestion with Snowplow

- E-store front-end and Snowplow JavaScript tracker: user activity is captured as Snowplow ecommerce events
- Snowplow to Shaped.ai: the Snowplow pipeline validates the events, enriches them with device and geolocation data, then forwards them into Shaped.ai's AWS Kinesis stream

## Acknowledgements

Thank you to the [Shaped.ai](https://shaped.ai) team for all the support while building this accelerator.

diff --git a/tutorials/recommendations-with-shaped-ai/meta.json b/tutorials/recommendations-with-shaped-ai/meta.json new file mode 100644 index 000000000..d7418e054 --- /dev/null +++ b/tutorials/recommendations-with-shaped-ai/meta.json @@ -0,0 +1,8 @@
{
  "title": "Build a recommendations system with Shaped.ai",
  "label": "Solution accelerator",
  "description": "Use Snowplow data to build a recommendations system with Shaped.ai.",
  "useCases": ["Real-time personalization"],
  "technologies": ["Shaped.ai"],
  "snowplowTech": ["Snowbridge"]
}

diff --git a/tutorials/recommendations-with-shaped-ai/real-time-integration.md b/tutorials/recommendations-with-shaped-ai/real-time-integration.md new file mode 100644 index 000000000..a0f39a20e --- /dev/null +++ b/tutorials/recommendations-with-shaped-ai/real-time-integration.md @@ -0,0 +1,200 @@
---
title: Real-time Integration
position: 6
---

Now that we have recommendations appearing on site and influencing behavior, we want to give Shaped.ai a feed of events as they occur so it can keep its suggestions up to date.

To do this, we'll use Snowbridge to intercept events coming from the site and send them to Shaped.ai. [Shaped.ai receives our events through AWS Kinesis](https://docs.shaped.ai/docs/connectors/snowplow), either via Snowbridge or via custom Kinesis forwarding, so the Kinesis connector should be configured beforehand. The best way to request this is [through their Slack server](https://docs.shaped.ai/docs/support/contact/), but email requests also work.

## Kinesis Connector Creation Flow

When contacting the Shaped.ai team, make sure to have your AWS Account ID ready. After you provide this information, the Shaped.ai team will give you AWS Kinesis credentials, which the Snowplow team will use to configure Snowbridge. They look like this:

```
"kinesisStreamArn": "arn:aws:kinesis:us-east-2:1234567890:stream/ShapedDatasetStream-1234567890e3fad47fa8eabf03a4",
"kinesisIamRoleArn": "arn:aws:iam::1234567890:role/ShapedDatasetAccessRole-1234567890e3fad47fa8eabf03a4",
```

Open a ticket with Snowplow support, providing these credentials along with the following additional information:

- All payloads should be sent as JSON, not as TSV (the default);
- Not all events should be sent. We only want events for which an `item_ids` property can be derived. To do this in Snowbridge, we apply a configuration that filters the events we will forward, plus a JS transform that checks that each event carries the fundamental pieces of information before it is forwarded.
The configuration is an HCL file (normally named `config.hcl`), and its contents look like the example below:

```hcl
transform {
  use "spEnrichedFilter" {
    atomic_field = "event_name"
    regex = "^(snowplow_ecommerce_action|action|view_item|transaction_item|create_order)$" # filter to only ecommerce events we're interested in
    filter_action = "keep"
  }
}

transform {
  use "js" {
    script_path = "/tmp/transform.js" # use a custom JS transform on the payload; not all ecommerce events can be filtered by just event_name so we need to do more checks there
    snowplow_mode = true # turn the TSV record into JSON for our transform function to handle
  }
}
```

We use the following as `transform.js` (it is more complex than strictly necessary because it accounts for several different ecommerce plugins):

```javascript
/**
 * @typedef {object} EngineProtocol
 * @property {boolean} [FilterOut]
 * @property {string} [PartitionKey]
 * @property {string | object} [Data]
 * @property {Record} [HTTPHeaders]
 */

const SKIP_EVENT = { FilterOut: true };

/**
 * @param {EngineProtocol} ep
 * @returns {EngineProtocol}
 */
function main(ep) {
  if (typeof ep.Data === "string") return SKIP_EVENT; // we should be in snowplow_mode

  const event = ep.Data;

  const ts = (event.derived_tstamp || event.collector_tstamp).UnixMilli();

  const client_session = (event.contexts_com_snowplowanalytics_snowplow_client_session_1 || [])[0] || {};

  ep.Data = {
    event_id: event.event_id,
    event_type: "",
    user_id: event.user_id || event.domain_userid || client_session.userId,
    session_id: event.domain_sessionid || client_session.sessionId,
    item_ids: undefined,
    sent_at: ts,
  };

  let payload = undefined;
  let products = undefined;

  switch (event.event_name) {
    case 'transaction_item': // classic ecommerce
      ep.Data.event_type = "Purchase";
      ep.Data.item_ids = [event.ti_sku];
      break;
    case 'action': // enhanced ecommerce
      payload = event.unstruct_event_com_google_analytics_enhanced_ecommerce_action_1;
      products = event.contexts_com_google_analytics_enhanced_ecommerce_product_1;
      if (!payload || !payload.action || !products) return SKIP_EVENT;
      ep.Data.item_ids = products.map((i) => i.id);
      if (payload.action === "view") {
        ep.Data.event_type = "View";
      } else if (payload.action === "click") {
        ep.Data.event_type = "Click";
      } else if (payload.action === "purchase") {
        ep.Data.event_type = "Purchase";
      } else return SKIP_EVENT;
      break;
    case 'snowplow_ecommerce_action': // snowplow ecommerce
      payload = event.unstruct_event_com_snowplowanalytics_snowplow_ecommerce_snowplow_ecommerce_action_1;
      products = event.contexts_com_snowplowanalytics_snowplow_ecommerce_product_1;
      if (!payload || !payload.type || !products) return SKIP_EVENT;
      ep.Data.item_ids = products.map((i) => i.id);
      if (payload.type === "product_view") {
        ep.Data.event_type = "View";
      } else if (payload.type === "list_view") {
        ep.Data.event_type = "View"; // treat product-list views as product views, too
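        // Note (added for clarity): this transform only distinguishes the
        // "View", "Click" and "Purchase" interaction types used elsewhere in
        // this tutorial, which is why list views are collapsed into "View";
        // any action type that cannot be mapped falls through to SKIP_EVENT.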
      } else if (payload.type === "list_click") {
        ep.Data.event_type = "Click";
      } else if (payload.type === "transaction") {
        ep.Data.event_type = "Purchase";
      } else return SKIP_EVENT;
      break;
    case 'view_item': // hyper-transactional ecommerce
      payload = event.unstruct_event_io_snowplow_ecomm_view_item_1;
      if (!payload) return SKIP_EVENT;
      ep.Data.event_type = "View";
      ep.Data.item_ids = [payload.item_id];
      break;
    case 'create_order': // hyper-transactional ecommerce
      ep.Data.event_type = "Purchase";
      payload = event.contexts_io_snowplow_ecomm_cart_1;
      if (!payload || !payload.items_in_cart) return SKIP_EVENT;
      ep.Data.item_ids = payload.items_in_cart.map((i) => i.item_id);
      break;
    default:
      return SKIP_EVENT;
  }

  if (!ep.Data.item_ids || !ep.Data.event_type) return SKIP_EVENT;
  return ep;
}
```

Now Snowbridge will:
- Filter the event stream to ecommerce events
- Transform them into a common format
- Submit the interactions to Shaped.ai as real-time events

This allows Shaped.ai to react to new behavior: it will periodically retrain itself and adjust its models to accommodate the newer observations.

### Merging the Snowbridge Dataset with Previously Imported Datasets

Now, we need to merge the previously imported data with the live data provided by Snowbridge. The following YAML file not only unifies both datasets into a new model, but also splits the `item_ids` arrays sent by Snowbridge into one event per item ID, per Shaped.ai's requirements:

```yml
model:
  description: A model to test Snowplow data integration with online and offline data.
  name: model_initial_data_plus_snowbridge
connectors:
  - id: recommendations_ecomm
    name: recommendations_ecomm
    type: Dataset
  - id: recommendations_ecomm_model_interactions
    name: recommendations_ecomm_model_interactions
    query: |
      SELECT
        DOMAIN_USERID as user_id,
        PRODUCT_ID as item_id,
        toDateTime(toInt64(DERIVED_TSTAMP) / 1000) as created_at,
        ECOMMERCE_ACTION_TYPE as event_value
      FROM recommendations_ecomm_model_interactions
    type: Dataset
  - id: interactions
    name: Snowplow
    query: |
      select distinct
        arrayJoin(
          splitByChar(',', translate(ifNull(item_ids, ''), '[]"', ''))
        ) AS item_id,
        domain_userid as user_id,
        event_name as event_value,
        dvce_created_tstamp as created_at
      from Snowplow
    type: Dataset
fetch:
  events: |
    SELECT
      user_id,
      item_id,
      created_at,
      1 AS label,
      event_value
    FROM recommendations_ecomm_model_interactions

    UNION ALL

    SELECT
      user_id,
      item_id,
      created_at,
      1 AS label,
      event_value
    FROM interactions
```

Using the Shaped.ai CLI, and supposing the YAML above is saved in a file named `model_definition_initial_data_plus_snowbridge.yaml`, you can run the following command:

```
$ shaped create-model --file ./model_definition_initial_data_plus_snowbridge.yaml
```

Make sure to update your previous calls to Shaped.ai from `testing_snowplow_model` to `model_initial_data_plus_snowbridge`.
\ No newline at end of file
diff --git a/tutorials/recommendations-with-shaped-ai/serving-recommendations.md b/tutorials/recommendations-with-shaped-ai/serving-recommendations.md new file mode 100644 index 000000000..b54e6b453 --- /dev/null +++ b/tutorials/recommendations-with-shaped-ai/serving-recommendations.md @@ -0,0 +1,34 @@
---
title: Serving Recommendations
position: 5
---

After the model is trained in Shaped.ai, we can retrieve recommendation results using their Rank API. This can be done in different ways.
For instance, through a CLI command:

```sh
$ shaped rank --model-name testing_snowplow_model --user-id 8a6fa6f8-7e0f-4ec5-bf38-94883fd7da6f
```

In a normal execution, the recommendations will be returned as a `RankResponse` object with two main properties: `ids` and `scores`. Each `id` is paired with the corresponding `score`. According to the Shaped.ai docs on the topic, each score...

> has the respective relevance confidence we have that this item is relevant to the query user. You can use this to get a bit more of an understanding of how the relevancy estimates change throughout the ranking.

The previous example makes it easy to build a small, publicly accessible recommendations service. The complete API is also a good way to learn how to use the Shaped.ai APIs in any stack, and it showcases additional endpoints that can be used. For instance:

```py
import shaped

SHAPED_AI_API_KEY = 'your-shaped-ai-api-key'

client = shaped.Client(SHAPED_AI_API_KEY)

# Fetch similar items
similar_items = client.similar_items(
    "testing_snowplow_model",
    "48538257883436"
)

print(similar_items)
```

With this as a basis, we can add this logic to any server endpoint, passing the user ID as a query string or body parameter and retrieving the item IDs we need for personalization, as well as additional information about each item.
\ No newline at end of file
diff --git a/tutorials/recommendations-with-shaped-ai/setup.md b/tutorials/recommendations-with-shaped-ai/setup.md new file mode 100644 index 000000000..47c601448 --- /dev/null +++ b/tutorials/recommendations-with-shaped-ai/setup.md @@ -0,0 +1,8 @@
---
position: 2
title: Installation and setup
---

This tutorial can be executed with a [Shaped.ai trial account](https://dashboard.shaped.ai/register) and [our corresponding Colab notebook](https://colab.research.google.com/drive/1m5ZXBAWnWJPWSedk4YRJ-kqYD4g3ElGt). The steps described in the notebook can also be followed in your local Python environment, as long as you're using a Python version between 3.8 and 3.11 for the Shaped.ai CLI.

Supporting files can be found at [the corresponding GitHub repository](https://github.com/snowplow-industry-solutions/ecommerce-recsys-with-shaped-ai).
\ No newline at end of file
diff --git a/tutorials/recommendations-with-shaped-ai/tracking-setup.md b/tutorials/recommendations-with-shaped-ai/tracking-setup.md new file mode 100644 index 000000000..e42a65b4c --- /dev/null +++ b/tutorials/recommendations-with-shaped-ai/tracking-setup.md @@ -0,0 +1,19 @@
---
title: Tracking Setup
position: 3
---

The recommended tracking setup relies on events from the ["Hyper Transactional" e-commerce schema](https://iglucentral.com/?q=io.snowplow.ecomm). This includes events like:

- Page views
- Product views
- Product clicks
- Add to cart / Remove from cart
- Collection viewed
- Search
- Checkout start
- Purchase (checkout end)

Of these, we will use the Product View events as the main signal to train Shaped.ai on. Most code we deploy will also work with the other e-commerce plugins.

We will also use [the Element Tracking plugin](https://github.com/snowplow/snowplow-javascript-tracker/tree/master/plugins/browser-plugin-element-tracking)'s `expose_element` events for measuring impressions of our recommendations once we are serving them.
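As an illustration of what this tracking looks like in practice, below is a minimal sketch of firing a product view as a self-describing event with the JavaScript tracker. The schema vendor, version, and field names are assumptions inferred from the Snowbridge transform later in this tutorial (which reads `unstruct_event_io_snowplow_ecomm_view_item_1` and its `item_id` field); substitute the exact schema reference used by your own implementation.

```javascript
// Hypothetical example: track a "view_item" event when a product detail page
// is rendered. Adjust the schema URI and fields to match your Iglu registry.
snowplow("trackSelfDescribingEvent", {
  event: {
    schema: "iglu:io.snowplow.ecomm/view_item/jsonschema/1-0-0", // assumed version
    data: {
      item_id: "48538191331628" // the product/variant ID used as ITEM_ID downstream
    }
  }
});
```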
\ No newline at end of file
diff --git a/tutorials/recommendations-with-shaped-ai/training-data.md b/tutorials/recommendations-with-shaped-ai/training-data.md new file mode 100644 index 000000000..5a38c769c --- /dev/null +++ b/tutorials/recommendations-with-shaped-ai/training-data.md @@ -0,0 +1,91 @@
---
title: Training Data
position: 4
---

In order for the Shaped.ai model to serve usable results, we need to give it some initial training on our actual store.
Out of the box, Shaped.ai will have no idea about our customers or products, let alone the relationships between them and what makes a good recommendation.

To solve this "cold start" problem, we need to export our catalog information for Shaped.ai to read, and give it an initial training dataset of interactions to build a model from.

From this point, a real Snowplow user would ideally use our [E-Commerce DBT models](https://github.com/snowplow/dbt-snowplow-ecommerce).

Of those models' outputs, the most relevant for Shaped.ai is the `snowplow_ecommerce_product_interactions` model. It captures the interactions users have with products, including views and purchases: exactly the information we need for Shaped.ai models!

Once the model runs and the `snowplow_ecommerce_product_interactions` table is populated, a Parquet file can be generated using one of the following guides, according to your data warehouse technology:

- Snowflake: https://docs.snowflake.com/en/user-guide/tutorials/script-data-load-transform-parquet
- Databricks: https://docs.databricks.com/aws/en/query/formats/parquet#options

From here we can look at Shaped.ai's requirements for a Product Interactions dataset and how we need to transform our data to match them.

We need:

- At least 1000 View events
- At least 25 users
- 4 columns per event:
  - `USER_ID`: We'll map this to the customer ID, or the Domain User ID for anonymous customers.
  - `ITEM_ID`: This will just be the product ID
  - `TIMESTAMP`: We will use `derived_tstamp` as the most accurate timestamp we have; this also needs to be transformed into Unix timestamp format
  - `EVENT_TYPE`: We will map our Snowplow events to be either 'View' or 'Purchase' as appropriate

With these training datasets in hand, we can now store them in Shaped.ai and set up Shaped.ai models to ingest them.

## Shaped.ai setup

To proceed, we need the following items:

- An initial dataset: the `snowplow_ecommerce_product_interactions` export created as a Parquet file in the previous step, so we can easily import it
- A model descriptor, in YAML format

Shaped.ai has a very good Python library. Make sure to use Python version 3.11 (currently, their CLI does not work on newer Python versions). Install their CLI using the following command:

```sh
$ pip install shaped
```

The next steps will require you to have full access to Shaped.ai, as creating API keys will be necessary.

To import your Parquet file into Shaped.ai, use the following commands:

```sh
$ shaped init --api-key your-shaped-ai-api-key-here
$ shaped create-dataset-from-uri --name recommendations_ecomm_model_interactions --path ./your-exported-file.parquet --type parquet
```

Make sure the datasets were imported successfully by checking each dataset in the Shaped.ai UI. After this, you should create a model using a YAML file.
One suggestion for this file's contents would be as follows:

```yml
# Specify your unique model name and any configurations.
model:
  name: testing_snowplow_model
  description: A model to test Snowplow data integration.

# Specify the datasets that your model needs.
connectors:
  - id: recommendations_ecomm
    name: recommendations_ecomm
    type: Dataset
  - id: recommendations_ecomm_model_interactions
    name: recommendations_ecomm_model_interactions
    type: Dataset

fetch:
  events: |
    SELECT
      DOMAIN_USERID as user_id,
      PRODUCT_ID as item_id,
      1 AS label,
      ECOMMERCE_ACTION_TYPE as event,
      ECOMMERCE_ACTION_TYPE as event_value,
      DERIVED_TSTAMP as created_at
    FROM recommendations_ecomm_model_interactions
```

Supposing the file was saved as `sample_model_definition.yaml`, the model can be created with this command:

```sh
$ shaped create-model --file sample_model_definition.yaml
```

If all goes well, Shaped.ai will show the model being created in its UI. After the model is created, we can use the Shaped.ai API to fetch pre-calculated recommendation scores!
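Once training has finished, a quick way to sanity-check the model is to request a ranking for a known user with the same CLI command shown in the Serving Recommendations section. The user ID below is a placeholder; replace it with a `user_id` value that actually appears in your imported dataset:

```sh
# Hypothetical check: ask the freshly trained model for recommendations
# for one of the users present in the training data.
$ shaped rank --model-name testing_snowplow_model --user-id <a-user-id-from-your-dataset>
```

If the command returns a list of item IDs with scores, the training data was ingested correctly.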