
  • Understand Continuous Integration and Continuous Delivery (CI/CD)

    When you and members of your team are each responsible for developing and maintaining different parts of your Fabric environment, a best practice is to work in isolated development environments until you’re ready to combine your development efforts and publish your changes to a particular pre-production environment. When you’re ready to publish, you need to make sure that your changes don’t break existing code or interfere with changes made by other developers. You also need to ensure that code changes are saved and can be reverted if there are issues. The built-in continuous integration and continuous delivery capabilities in Fabric can help facilitate this.

    Continuous integration and continuous delivery (CI/CD) is a process for integrating code contributions from multiple developers into a single main codebase. Contributions are committed frequently, and automated processes build and test the new code. Code moves into production continuously, reducing feature development time.

    Continuous integration

    If developers work on separate code branches on their local machines for long periods of time and only merge their changes into the main codebase once their work is finished, the likelihood of conflicts and bugs increases. These problems might only be identified in later development stages and can slow down delivery of features to users.

    Continuous integration (CI) helps you avoid bugs and code failures and lets you continuously develop and release functionality. In CI, you frequently commit code to a shared code branch or trunk in a version control system, and once it’s merged, changes are validated by a build process and automated testing. Conflicts between new and existing code are identified earlier in the development process and are easier and faster to fix.

    Continuous delivery

    Continuous delivery happens after continuous integration. Once CI is complete, code is deployed to a staging environment where more automated testing is performed before code is released into production.

    Continuous deployment

    Continuous deployment is a process that automatically releases updates into production environments through structured deployment stages, once they pass automated tests.

    Use CI/CD in Fabric

    Managing the lifecycle of Fabric items using CI/CD has two parts: integration and deployment. Integration is implemented using Git. Deployment is implemented using Fabric deployment pipelines. Automation of deployment or integration is implemented using Fabric REST APIs.

    • Git: Lets your team collaborate using branches, and provides version control. It helps you manage incremental code changes, and see code history.
    • Deployment pipelines: Lets you promote code changes to different environments like development, test, and production.
    • Fabric REST APIs: Enables automation and lets you programmatically manage CI/CD processes.


  • Inclusive AI with Copilot

    Microsoft Copilot is changing the way you approach everyday tasks, making them more accessible to everyone.

    In this video, you discover how Copilot helps reduce barriers, provides writing assistance, and summarizes information in seconds. Explore real-world instances where Copilot creates text from short prompts, suggests full sentences, and generates descriptions and summaries of online material.

    https://go.microsoft.com/fwlink/?linkid=2296201

    Copilot was created with inclusive design in mind. Its voice commands and text generation help make technology accessible and empowering for individuals with disabilities.


  • AI as an ally for humanitarian action

    As our world undergoes rapid transformations and faces numerous complex challenges, the need for effective humanitarian action has never been more urgent. AI is at the forefront, offering solutions to aid people in need.

    In this video, you uncover the power of AI in addressing some of the most pressing challenges faced by communities worldwide. From predicting natural disasters to optimizing the distribution of resources, AI is enabling more effective and timely responses to crises.

    https://go.microsoft.com/fwlink/?linkid=2296400

    Recent AI advancements can help tackle some of the most pressing global challenges. As AI progresses, our capacity to address significant climate and humanitarian challenges grows.


  • Impact of AI in job roles

    As AI continues to evolve, it isn’t only transforming how you perform tasks but also creating entirely new roles and processes that were previously unimaginable.

    This video examines AI’s progress in sectors like finance, healthcare, energy, manufacturing, retail, public sector, and agriculture. It covers AI’s role in fraud detection in finance, improving patient care in healthcare, boosting energy efficiency, and personalizing retail experiences. By the end of the video, you’ll have a better understanding of how AI is transforming job roles and fostering growth and innovation.

    https://go.microsoft.com/fwlink/?linkid=2296301

    AI is transforming industries by introducing new roles and methodologies that boost efficiency and foster innovation. Whether in finance, healthcare, energy, manufacturing, retail, the public sector, or agriculture, AI is catalyzing advancements and propelling growth. As you explore this environment, it’s important to be an agile learner, prepared to adapt and enhance your skills. Using AI’s capabilities enables you to accomplish more and reshape your professional journey.


  • The impact of AI on accessibility

    Today, AI advancements make technology more accessible for everyone. These innovations are breaking down barriers and creating new opportunities for individuals with diverse abilities. AI is proving to be a catalyst for inclusivity, enabling people to access information, communicate, and participate in various aspects of life more easily. Through these efforts, technology is becoming a powerful ally for inclusiveness and accessibility.

    In this video, you explore the impact of AI on accessibility, highlighting how new initiatives are breaking down barriers and creating new opportunities for people with diverse abilities. The projects highlighted in this video demonstrate AI’s potential to enhance inclusivity, making it easier for individuals with disabilities to access information, communicate, and participate in various aspects of life.

    https://go.microsoft.com/fwlink/?linkid=2296300

    From enhancing braille literacy in India to transforming employment opportunities for neurodivergent individuals, AI is at the forefront of making our digital world more inclusive.


  • Automate actions

    Activator is a technology in Microsoft Fabric that enables automated processing of events that trigger actions. For example, you can use Activator to notify you by email when a value in an Eventstream deviates from a specific range or to run a notebook to perform some Spark-based data processing logic when a Real-Time Dashboard is updated.

    Screenshot of an Activator alert in Microsoft Fabric.

    This image shows a rule configured to alert when package delivery failures happen, demonstrating how you can automate responses to specific business events.

    Understand Activator key concepts

    Activator operates based on four core concepts: Events, Objects, Properties, and Rules.

    • Events - Each record in a stream of data represents an event that has occurred at a specific point in time.
    • Objects - The data in an event record can be used to represent an object, such as a sales order, a sensor, or some other business entity.
    • Properties – The fields in the event data can be mapped to properties of the business object, representing some aspect of its state. For example, a total_amount field might represent a sales order total, or a temperature field might represent the temperature measured by an environmental sensor.
    • Rules – The key to using Activator to automate actions based on events is to define rules that set conditions under which an action is triggered based on the property values of objects referenced in events. For example, you might define a rule that sends an email to a maintenance manager if the temperature measured by a sensor exceeds a specific threshold.
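    Activator rules themselves are configured through the Fabric user interface rather than written as code, but the condition a rule evaluates is conceptually similar to a KQL filter over incoming events. The following is only an illustrative sketch; the sensorReadings table and its sensorId and temperature fields are hypothetical names, not part of any Fabric sample:

```kql
// Hypothetical event stream, shown only to illustrate the rule concept:
// each row is an event, sensorId identifies the object being tracked,
// and temperature is a property of that object.
sensorReadings
| where temperature > 75       // the condition the rule checks on each event
| project ["time"], sensorId, temperature
```

    In Activator, the equivalent rule would watch the temperature property of each sensor object and trigger the configured action, such as an email to the maintenance manager, whenever the condition is met.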

    Use cases for Activator

    Activator can help you in various scenarios, such as dynamic inventory management, real-time customer engagement, and effective resource allocation in cloud environments. It’s a powerful tool for any circumstance that requires real-time data analysis and automated actions.

    Use Activator to:

    • Initiate marketing actions when product sales drop.
    • Send notifications when temperature changes could affect perishable goods.
    • Flag real-time issues affecting the user experience on apps and websites.
    • Trigger alerts when a shipment hasn’t been updated within an expected time frame.
    • Send alerts when a customer’s account balance crosses a certain threshold.
    • Respond to anomalies or failures in data processing workflows immediately.
    • Run ads when same-store sales decline.
    • Alert store managers to move food from failing grocery store freezers before it spoils.


  • Visualize real-time data

    Real-Time Dashboards provide a way to pin data visualizations to a single visual interface, enabling you to surface real-time insights at a glance. Each tile in a dashboard shows you different information based on a KQL query that extracts real-time data from tables in an eventhouse.

    Screenshot of a real-time dashboard in Microsoft Fabric.

    Create a Real-Time Dashboard

    You can create a Real-Time Dashboard in a workspace and then configure its source, or you can create one directly from a KQL queryset in an eventhouse.

    Dashboards are composed of one or more tiles, each containing a visualization based on a KQL query expression. By default, the visualization shows the results of the query as a table; but you can edit the tile to customize how the data is displayed.
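    As an illustrative sketch of what a tile’s underlying query might look like, the example below aggregates recent events and includes a render hint that suggests a default visualization. It assumes the stock table used in the KQL query examples elsewhere in this module:

```kql
// Average bid price per symbol over the last hour, bucketed by minute.
// The render operator suggests a default visualization for the tile.
stock
| where ["time"] > ago(1h)
| summarize avgPrice = avg(todouble(bidPrice)) by bin(["time"], 1m), symbol
| render timechart
```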

    When published, tiles let you explore the data they contain interactively: you can drill into the data, filter and aggregate it through a visual interface, and change the visualization type.

     Tip

    To learn more about Real-Time Dashboards, see Create a Real-Time Dashboard.

    Visualize real-time data with Power BI

    You can also create Power BI reports from your KQL database data.

    Screenshot of a Power BI report editor in Microsoft Fabric.


  • Store and query real-time data

    KQL databases in an Eventhouse are where you store and query real-time data that flows from Eventstreams and other streaming sources. Once data is loaded into tables, you can use the Kusto Query Language (KQL) or T-SQL to query your data.

    Screenshot of an Eventhouse in Microsoft Fabric.

    Within an eventhouse, you can create:

    • KQL databases: Real-time optimized data stores that host a collection of tables, stored functions, materialized views, shortcuts and data streams.
    • KQL querysets: Collections of KQL queries that you can use to work with data in KQL database tables. A KQL queryset supports queries written using Kusto Query Language (KQL) or a subset of the Transact-SQL language.

    Understand the power of Kusto Query Language (KQL)

    To query data in a table in a KQL database, you can use KQL. KQL is specifically designed for analyzing large volumes of structured, semi-structured, and unstructured data with exceptional performance. KQL databases are optimized for time-series data; they index incoming data by ingestion time and partition it for optimal query performance. KQL is the same language used in Azure Data Explorer, Azure Monitor Log Analytics, Microsoft Sentinel, and Microsoft Fabric.

    Get familiar with KQL syntax

    KQL queries are made of one or more query statements. A query statement consists of a table name followed by operators that take, filter, transform, aggregate, or join data. For example, to return any 10 rows from the stock table, execute:


    stock
    | take 10
    

    A more complex example might aggregate data to find average stock prices over the last 5 minutes:


    stock
    | where ["time"] > ago(5m)
    | summarize avgPrice = avg(todouble(bidPrice)) by symbol
    | project symbol, avgPrice
    

     Tip

    To learn more about KQL, see Kusto Query Language (KQL) overview.

    Automate data processing with management commands

    Beyond basic querying, you can automate data processing through management commands including:

    • Update policies: Automatically transform incoming data and save it to different tables as it arrives.
    • Materialized views: Precalculate and store summary results for faster queries.
    • Stored functions: Save frequently used query logic that you can reuse across multiple queries.
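    As a sketch of the last two mechanisms, assuming the stock table from the earlier query examples (the names AvgPriceLast5m and StockAvgPrice are illustrative, not part of any Fabric sample):

```kql
// Stored function: saves reusable query logic under a name.
.create-or-alter function AvgPriceLast5m() {
    stock
    | where ["time"] > ago(5m)
    | summarize avgPrice = avg(todouble(bidPrice)) by symbol
}

// Materialized view: precalculates and stores the summary, keeping it
// up to date as new rows are ingested into the source table.
.create materialized-view StockAvgPrice on table stock
{
    stock
    | summarize avgPrice = avg(todouble(bidPrice)) by symbol
}
```

    The stored function can then be invoked like a table expression (for example, AvgPriceLast5m() | take 10), while the materialized view is queried by name.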

     Tip

    For more information about working with KQL databases, including detailed examples of update policies, materialized views, and stored functions, see Work with real-time data in a Microsoft Fabric Eventhouse.

    Other query options

    Using SQL

    KQL databases in Eventhouses also support a subset of common T-SQL expressions for data professionals already familiar with T-SQL syntax. For example:


    SELECT TOP 10 * FROM stock;
    

    Use Copilot to help with queries

    Microsoft Fabric includes Copilot for Real-Time Intelligence, which can help you write queries to extract insights from your Eventhouse data. Copilot uses AI to understand what you’re looking for and can generate the required query code.


  • Ingest and transform real-time data

    Real-Time Intelligence in Microsoft Fabric provides two primary approaches for ingesting streaming data: using eventstreams or directly ingesting data into a KQL database in an Eventhouse.

    Eventstreams for data ingestion and transformation

    Eventstreams are a way to bring real-time events into Fabric, to transform them, and then route data to a destination.

    Screenshot of an Eventstream in Microsoft Fabric.

    The image shows the three main components of an Eventstream: sources, where data originates; transformations, optional processing applied to the data; and destinations, where the processed data is sent.

    Think of the Eventstream components like a water pipe system: the source is the faucet, transformations are filters along the way, and the destination is the sink or bucket where you collect and use the water.

    Next, let’s review each component of an Eventstream.

    Data sources for eventstreams

    Once you create an eventstream in Fabric, you can connect it to a wide range of data sources. You can stream data from Microsoft sources and also ingest data from non-Microsoft platforms including:

    • Microsoft sources, like Azure Event Hubs, Azure IoT Hubs, Azure Service Bus, Change Data Capture (CDC) feeds in database services, and others.
    • Azure events, like Azure Blob Storage events.
    • Fabric events, such as changes to items in a Fabric workspace, data changes in OneLake data stores, and events associated with Fabric jobs.
    • External sources, such as Apache Kafka, Google Cloud Pub/Sub, and MQTT (Message Queuing Telemetry Transport) (in preview).

     Tip

    To see all supported sources, see Supported sources.

    Event transformations in eventstreams

    Raw data from a source system is rarely in the exact format you need for analysis or storage. Transformations are what make your data useful and actionable. You can transform the data as it flows in an eventstream, enabling you to filter, summarize, and reshape it before storing it. Examples of available transformations include SQL code, filter, manage fields, aggregate, group by, expand, and join.

     Tip

    For more information about supported transformations, see Process event data with event processor editor and Process events using SQL code editor.

    Data destinations in eventstreams

    Streaming data flows continuously and is temporary by nature. It requires immediate processing and storage to retain its value. The destination in an eventstream is what makes your real-time data processing actionable. It’s where your processed data becomes available for queries, reports, dashboards, alerts, actions, or integration with other systems. You can load the data from your stream into the following destinations: a KQL database in an Eventhouse, a lakehouse, a derived stream, Fabric Activator, or a custom endpoint.

     Tip

    For more information about supported destinations, see Add and manage a destination in an eventstream.

    Direct ingestion to a KQL database in an Eventhouse

    Data can also be directly ingested into a KQL (Kusto Query Language) database in an Eventhouse. Some examples of data ingestion sources include: local files, Azure storage, Amazon S3, Azure Event Hubs, OneLake, and more. Data ingestion can be configured using connectors or through the Get data option in a KQL database as shown in this image.

    Screenshot of the get data option in a KQL database in an Eventhouse in Microsoft Fabric.

     Tip

    For more information about supported ingestion sources for KQL databases in Eventhouses, see Data sources and Data connectors overview.

    Data transformation in a KQL database in Eventhouse with update policies

    When directly ingesting data into a KQL database, data first lands in the database, then can be transformed using update policies. This is different from eventstream transformations that occur during stream processing, before routing data to a destination.

    Update policies are automation mechanisms triggered when new data is written to a table. They run a query to transform ingested data and save the result to a destination table.
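    As a hedged sketch of how such a policy is defined with management commands (the rawStock and cleanStock tables and the TransformStock function are hypothetical names used for illustration):

```kql
// 1. A function that reshapes rows from the source table.
.create-or-alter function TransformStock() {
    rawStock
    | where isnotempty(symbol)
    | extend bidPrice = todouble(bidPrice)
}

// 2. An update policy on the target table: whenever new data lands in
//    rawStock, the query runs and its results are saved to cleanStock.
.alter table cleanStock policy update
@'[{"IsEnabled": true, "Source": "rawStock", "Query": "TransformStock()", "IsTransactional": false}]'
```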


  • Real-Time Intelligence in Microsoft Fabric

    As organizations generate increasing volumes of event-driven data, the ability to process, analyze, and act on data in motion becomes essential for competitive advantage. Real-Time Intelligence provides comprehensive capabilities for working with streaming data with minimal latency.

    Explore Real-Time Intelligence use cases

    Unlike batch systems that process data on scheduled intervals, Real-Time Intelligence helps you respond to events as they happen, delivering near real-time insights.

    Here are some common types of event data and examples of how Real-Time Intelligence can support downstream actions and business responsiveness:

    • Delivery tracking: Monitor vehicle locations to alert customers when packages are delayed
    • Equipment monitoring: Track machine temperature to prevent costly breakdowns
    • Fraud detection: Analyze purchase patterns to block suspicious transactions immediately
    • Website performance: Monitor page load times to improve user experience
    • System health: Track application errors to maintain service reliability

    Real-Time Intelligence components

    Microsoft Fabric’s Real-Time Intelligence is an integrated set of components that work together to handle streaming data from capture through automated response.

    Diagram of Fabric Real-Time Intelligence capabilities.

    The diagram shows how Real-Time Intelligence components work together for end-to-end processing. Each component handles a specific stage of the real-time analytics process:

    Ingest and process data in motion with Eventstreams

    Data ingestion and processing can happen through Eventstreams, which capture streaming data from various sources and apply real-time transformations as data flows through the system. Eventstreams can filter, enrich, and transform your data and route it to different destinations.

    Store real-time data in an Eventhouse

    Real-Time Intelligence stores data in KQL (Kusto Query Language) databases in Eventhouses. These databases are designed for time-series data and fast ingestion of streaming data. The storage integrates with OneLake, making your data available to other Fabric tools.

    Analyze data with KQL Queryset

    A KQL Queryset provides a workspace for running and managing queries against KQL databases. It lets you save queries for future use, organize multiple query tabs, and share queries with others for collaboration. KQL Querysets also support T-SQL queries, so you can use familiar SQL syntax alongside KQL for data analysis.

    Visualize insights with Real-Time Dashboard

    Real-Time Dashboards connect directly to KQL databases and refresh automatically as new data arrives. These dashboards let you explore data interactively and monitor both current conditions and historical trends.

    Act on data with Activator

    Automated actions can be configured with Activator, which continuously monitors streaming data against user-defined rules and thresholds. When conditions are met, Activator can send notifications, trigger workflows in Power Automate, or execute Fabric data pipelines or notebooks, creating event-driven automation that responds to real-time conditions.

    Discover streaming data with the Real-Time hub

    The Fabric Real-Time hub is a central location where you can discover and manage all of the data-in-motion that you have access to. It gives you a way to ingest streaming data from Azure and from external sources, and it lets you subscribe to Azure and Fabric events.

    Think of the Real-Time hub as your streaming data catalog where you can see what’s happening in near real-time across your organization. There are connectors you can use to ingest data into Microsoft Fabric from various sources. For example, you might connect to IoT sensor streams through Azure Event Hubs, subscribe to Azure Blob Storage events, use Change Data Capture (CDC) to stream database changes, or monitor Fabric workspace events.

    Once you’ve configured a connection to a data source or event source, these items become the foundation for event-driven decision making and a wide range of real-time analytics solutions, from building dashboards and setting up alerts to triggering automated workflows and analyzing trends in your data.

    Screenshot of Microsoft Fabric Real-Time hub.

    To access the Real-Time hub, select the Real-Time icon in the main Fabric menu bar.

    The Real-Time hub organizes data-in-motion into several main categories:

    • Data sources: Browse and connect to available streaming data sources, such as Microsoft sources, database change data capture feeds, and external sources from other cloud providers
    • Azure sources: Discover and configure Azure streaming data sources such as Azure IoT Hub, Azure Service Bus, Azure Data Explorer DB, and more
    • Fabric events: Subscribe to system-generated events in Fabric that you can access, like job status changes, events produced by action on files or folders in OneLake, and Fabric workspace item changes
    • Azure events: Subscribe to system events from Azure services that can be used to trigger automated responses, such as actions on files or folders in Azure Blob Storage

    In the Real-Time hub, you can preview and explore your streaming data by navigating directly to eventstreams or KQL databases in eventhouses for deeper analysis and querying. You can also build automated responses using Activator rules that trigger actions like notifications, workflows, or data processing when specific patterns are detected.
