
Semantic Link Labs is a Python library designed for use in Microsoft Fabric notebooks. It extends the capabilities of Semantic Link, offering additional functionality that integrates and works seamlessly alongside it. The goal of Semantic Link Labs is to simplify technical processes, empowering people to focus on higher-level activities and allowing tasks that are better suited for machines to be handled efficiently without human intervention.
If you encounter any issues, please raise a bug.
If you have ideas for new features/functions, please request a feature.
Check out the video below for an introduction to Semantic Link, Semantic Link Labs and demos of key features!
Check out the helper notebooks for getting started! Run the code below to load all the helper notebooks to the workspace of your choice at once.
import sempy_labs as labs
import requests
workspace_name = None # Update this to the workspace in which you want to save the notebooks
api_url = "https://api.github.com/repos/microsoft/semantic-link-labs/contents/notebooks"
response = requests.get(api_url)
files = response.json()
notebook_files = {file['name'][:-6]: file['html_url'] for file in files if file['name'].endswith('.ipynb')}
for file_name, file_url in notebook_files.items():
    labs.import_notebook_from_web(notebook_name=file_name, url=file_url, workspace=workspace_name)
%pip install semantic-link-labs
import sempy_labs as labs
import sempy_labs.lakehouse as lake
import sempy_labs.report as rep
from sempy_labs import migration, directlake, admin, graph, theme, mirrored_azure_databricks_catalog, ml_model, variable_library
from sempy_labs.tom import connect_semantic_model
from sempy_labs.report import connect_report
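As a quick illustration of what the library enables once imported, the sketch below uses connect_semantic_model to open a semantic model's Tabular Object Model and print its table names. This is a minimal sketch: the model name is hypothetical, and the exact parameters should be verified against the library documentation.
from sempy_labs.tom import connect_semantic_model

dataset_name = "Sales Model"  # hypothetical semantic model name
workspace_name = None  # None defaults to the workspace of the notebook

# Open the model's Tabular Object Model (TOM) in read-only mode and list its tables.
with connect_semantic_model(dataset=dataset_name, workspace=workspace_name, readonly=True) as tom:
    for table in tom.model.Tables:
        print(table.Name)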
An even better way to ensure the semantic-link-labs library is available in your workspace/notebooks is to load it as a library in a custom Fabric environment. If you do this, you will not have to run the above '%pip install' code every time in your notebook. Please follow the steps below.
The following process automates the migration of an import or DirectQuery model to a new Direct Lake model. The first step applies specifically to models that use Power Query to perform data transformations. If your model does not use Power Query, you must migrate the base tables used in your semantic model to a Fabric lakehouse.
Check out Nikola Ilic's terrific blog post on this topic!
Check out my blog post on this topic!
[!NOTE] The first 4 steps are only necessary if you have logic in Power Query. Otherwise, you will need to migrate your semantic model source tables to lakehouse tables.
[!NOTE] Calculated tables are also migrated to Direct Lake (as data tables with their DAX expression stored as model annotations in the new semantic model). Additionally, Field Parameters are migrated as they were in the original semantic model (as a calculated table). Auto date/time tables are not migrated. Auto date/time must be disabled in Power BI Desktop and proper date table(s) must be created prior to migration.
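To give a sense of how the migration is driven from a notebook, here is a minimal sketch of the first step using the migration module imported earlier. The model name is hypothetical, and the function signature shown is an assumption to verify against the library documentation; the remaining steps use the other functions in the migration module.
from sempy_labs import migration

dataset_name = "Sales Model"  # hypothetical import/DirectQuery semantic model to migrate
workspace_name = None  # workspace of the existing model; None defaults to the notebook's workspace

# Step 1 (Power Query models only): export the model's Power Query logic as a
# Power Query Template (.pqt) file, which can then be imported into Dataflows Gen2
# to land the source tables in a Fabric lakehouse.
migration.create_pqt_file(dataset=dataset_name, workspace=workspace_name)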
This project welcomes contributions and suggestions. Most contributions require you to agree to a Contributor License Agreement (CLA) declaring that you have the right to, and actually do, grant us the rights to use your contribution. For details, visit https://cla.opensource.microsoft.com.
When you submit a pull request, a CLA bot will automatically determine whether you need to provide a CLA and decorate the PR appropriately (e.g., status check, comment). Simply follow the instructions provided by the bot. You will only need to do this once across all repos using our CLA.
This project has adopted the Microsoft Open Source Code of Conduct. For more information see the Code of Conduct FAQ or contact opencode@microsoft.com with any additional questions or comments.
To set up a local development environment, create and activate a virtual environment, then install the build tooling:
python -m venv venv
.\venv\Scripts\Activate.ps1
pip install build
When making changes, always create a new branch.
Running the following in the terminal in Visual Studio Code will create a .whl file in the 'dist' folder within your locally-cloned repository.
python -m build
We use black as our code formatting standard. Make sure to run black on your code before submitting a pull request.
Run the following command to install black:
pip install black==25.1.0
Run the following command to format your code with black:
python -m black src
This project may contain trademarks or logos for projects, products, or services. Authorized use of Microsoft trademarks or logos is subject to and must follow Microsoft's Trademark & Brand Guidelines. Use of Microsoft trademarks or logos in modified versions of this project must not cause confusion or imply Microsoft sponsorship. Any use of third-party trademarks or logos is subject to those third parties' policies.