Blog

  • Design for usage optimization

    Different services come with different features and price points. After you pick a plan, don’t let those features go to waste. Find ways to use them fully and get your money’s worth. Also, keep an eye on your billing models. It’s smart to check if there’s a better billing model that fits how you’re actually using the service.

    Example scenario

    Contoso University hosts a commercial off-the-shelf (COTS) system that helps faculty manage courses and lets students register. It’s connected to a cloud-based education management system that they plan to fully switch to in a few years. For now, they want to optimize costs on the custom integration parts.

    The technology solution of the COTS offering is generally treated as a black box, except for its database, which runs on Azure Database for MySQL. The custom integration is an Azure durable function that fans out its work and runs on a Standard Azure App Service plan that used to host the university's website but no longer does. The durable function is a Python app that uses Azure Storage. Every night, it syncs data from the MySQL database to the cloud-based API.

    Use the full value of your resources

    Buy only what you need, and use everything that you’re paying for.

    Some resource SKUs come with built-in features for performance, security, or reliability. If you’re paying for them, make sure you’re using them. And if you don’t need those features, pick a simpler SKU to save money.
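
    The rightsizing check can be sketched as a simple feature-coverage comparison. The tier names, feature sets, and prices below are illustrative placeholders, not an authoritative Azure catalog:

```python
# Hypothetical rightsizing check: pick the cheapest tier whose feature
# set covers what the workload actually uses. Tier names, features, and
# prices are illustrative placeholders, not an authoritative catalog.
TIERS = {
    "Basic": {
        "features": {"custom_domains", "manual_scale"},
        "monthly_cost": 55,
    },
    "Standard": {
        "features": {"custom_domains", "manual_scale", "autoscale",
                     "staging_slots", "daily_backups"},
        "monthly_cost": 73,
    },
}

def cheapest_sufficient_tier(required_features: set) -> str:
    """Return the lowest-cost tier that covers the required features."""
    candidates = [
        (tier["monthly_cost"], name)
        for name, tier in TIERS.items()
        if required_features <= tier["features"]
    ]
    if not candidates:
        raise ValueError("No tier satisfies the required features")
    return min(candidates)[1]
```

    For example, a workload that only needs custom domains and manual scaling would land on the cheaper tier, while one that needs autoscale would justify the higher one.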

    Contoso’s challenge

    • The durable function runs on a Standard App Service plan that was originally sized for a public website, but that website has since been retired.
    • The team never re-evaluated the SKU, so they’re still paying for features and capacity that they don’t use.
    • They’re unsure which features are actually needed for the integration workload.

    Applying the approach and outcomes

    • The team reviews the current App Service plan and concludes that the integration doesn’t require the same level of scalability or performance and can be supported by a lower-tier configuration.
    • They move the function to a lower-tier plan that still supports durable functions but costs much less.
    • They also check their MySQL SKU and confirm that it’s rightsized for the current workload.
    • These changes help them reduce costs without affecting performance or reliability.

    Optimize your high availability design

    As part of your recovery plan, prioritize the deployment of active-active or active-only models over active-passive models if you already paid for the resources.

    If your design defaults to using active-passive models, you might have idle resources that could otherwise be used. Converting to active-active might enable you to meet your load leveling and scale bursting requirements without overspending. If you can meet your recovery targets with an active-only model, the costs of those resources can be removed completely.
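
    A minimal sketch of this trade-off, assuming a tested restore time and a flat per-instance cost (both made-up inputs):

```python
# Illustrative sketch: choose the cheapest availability model that still
# meets the recovery time objective (RTO). The model names, the tested
# restore time, and the flat per-instance cost are made-up inputs.
def choose_recovery_model(rto_hours: float, tested_restore_hours: float,
                          instance_monthly_cost: float):
    """Prefer active-only (one instance) when restoring from backup fits
    the RTO; otherwise keep an active-passive standby (two instances)."""
    if tested_restore_hours <= rto_hours:
        return ("active-only", instance_monthly_cost)
    return ("active-passive", 2 * instance_monthly_cost)
```

    With a three-hour RTO and a tested 1.5-hour restore, the single-instance model meets the target and the standby cost disappears.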

    Contoso’s challenge

    • The COTS application uses Azure Database for MySQL Flexible Server configured for same-zone high availability, which provides a standby server in the same availability zone as the primary server. They also have enabled automatic backups.
    • The workload’s recovery point objective (RPO) is relatively long at 12 hours, and the recovery time objective (RTO) is three hours during the school day.
    • Based on previous recovery tests, the team knows that they can meet their RPO and RTO targets through automatic failover to the standby server. They have also tested recovering the database from a backup and they can meet the targets in this scenario.

    Applying the approach and outcomes

    • The workload team reevaluates the benefit of the high availability design against its cost, because the service costs twice as much as a single instance.
    • The team tests building a new instance and recovering a database from backup and they’re satisfied that they will still be in compliance with their recovery targets, so they decide to eliminate the standby instance.
    • The team updates the disaster recovery plan to reflect the new recovery strategy and realize the cost savings through the new configuration.


  • Design with a cost-efficiency mindset

    Every architectural decision affects your budget, such as whether you build or buy, what tools you use, or how you license and train. It’s important to weigh those options and make trade-offs that still meet your app’s needs without overspending.

    Example scenario

    Contoso Manufacturing runs a custom-built warehouse management system (WMS) that handles its four warehouses across South America. They want to update the WMS and move it to the cloud. They're deciding between a lift-and-shift move of the current solution or a greenfield build with modern cloud tools. Leadership wants to keep costs under control, so the team needs a plan that maintains cost efficiency.

    The WMS solution is a .NET application that runs on Internet Information Services (IIS) and uses SQL Server for its databases.

    Understand the full cost of your design

    Measure the total cost incurred by technology and automation choices, taking into account the impact on return on investment (ROI). The design must work within the acceptable boundaries for all functional and nonfunctional requirements. The design must also be flexible to accommodate predicted evolution. Factor in the cost of acquisition, training, and change management.

    Implementing a balanced approach that takes ROI into account prevents over-engineering, which might increase costs.

    Contoso’s challenge

    • The engineering team at Contoso is excited to move their warehouse system to the cloud, just like other teams have done.
    • They know the current app has some technical debt, so they’re planning to rewrite much of the application code and switch to newer cloud-native tools.
    • The engineering team wants to redesign everything into microservices and run it on Azure Kubernetes Service (AKS), which is a new and exciting platform for them.

    Applying the approach and outcomes

    • The team is excited about doing a significant redesign during the cloud move, but they know that they need to maintain the workload’s ROI. So they must stick with tools that they already know and avoid major rewrites that require extra engineering team training.
    • The workload team takes a practical approach to designing the system. They want it to be cost-effective, meet expectations, and avoid overcomplicating things. To keep the ROI in check and make the migration smooth, they decide to go with an equivalent solution in the cloud, such as Azure App Service.
    • They establish a cost baseline that accounts for infrastructure, licensing, and operational costs, as well as less obvious factors like training for new platforms, rewriting legacy code, and managing change across teams. They gain a clearer picture of what’s feasible within their budget, which confirms their decision of App Service as the more familiar, lower-risk path.
    • During the migration, the team plans to clean up some of the technical debt that makes sense to tackle now. That way, after everything’s running on Azure, they’ll be in a better spot to keep improving the platform while still keeping the ROI in mind when making those choices.

    Refine the design

    Fine-tune the design by prioritizing services that can reduce the overall cost, don’t need additional investment, or don’t have a significant impact on functionality. Prioritization should account for the business model and technology choices that bring high ROI.

    You can explore cheaper options that might enable resource flexibility or dynamic scaling, or you might justify the use of existing investments. The prioritization parameters might factor in costs that are required for critical workloads, runtime, and operations, and other costs that might help the team work more efficiently.

    Contoso’s challenge

    • The existing workload is hosted on a hyperconverged infrastructure (HCI) appliance, and the team's cost center is charged back for compute, network, and storage costs.
    • The workload has deployed the preproduction and production environments on Windows virtual machines.
    • The team uses GitHub Actions with self-hosted runners to run its CI/CD jobs.

    Applying the approach and outcomes

    • After evaluating several cloud-native options, the team decides that moving the web components to App Service would provide Windows IIS application compatibility without significant changes and wouldn’t require significant training.
    • The team decides to continue using GitHub Actions with self-hosted runners, but they’ll migrate to a virtual machine scale set with the ability to scale to zero nodes when they aren’t being used.
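
    The scale-to-zero idea can be sketched as a tiny capacity function; the parameter names are illustrative:

```python
# Minimal sketch of the scale-to-zero idea for self-hosted runners: size
# the scale set to the queued-job count, capped at a maximum, and drop
# to zero when the queue is empty. Parameter names are illustrative.
def desired_runner_count(queued_jobs: int, max_runners: int) -> int:
    return max(0, min(queued_jobs, max_runners))
```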

    Design your architecture to support cost guardrails

    Set up cost limits in your architecture to keep spending within a safe range, and ensure that your cloud environment costs are kept under those limits.

    Enforcing limits helps avoid surprise charges and ensures that you only use what you actually budget for.

    Contoso’s challenge

    • The current system doesn’t have cost guardrails, but since it rarely changes, no one’s pushed to add them.
    • The HCI environment owners have set a resource cap, so the workload can’t use more compute or storage than allowed.
    • The team’s worried that moving to the cloud could lead to unexpected costs, and they’re not sure how to avoid that.

    Applying the approach and outcomes

    • The team learns how to use Microsoft Cost Management solutions.
    • They plan to set scale limits for the App Service plans.
    • They plan to set up a deny policy to block certain expensive virtual machine SKUs from being used.
    • They plan to add automation to save on storage. Older or less-used data will automatically move to cheaper storage tiers like cold or archive. This kind of automation wasn’t possible in their old HCI environment.
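
    The tiering automation can be sketched as a rule that maps last-access age to a target tier. The thresholds are example policy values, not service defaults:

```python
# Hedged sketch of the tiering automation: map a blob's last-access age
# to a target tier. The thresholds are example policy values chosen for
# illustration, not service defaults.
def target_tier(days_since_last_access: int) -> str:
    if days_since_last_access >= 180:
        return "archive"
    if days_since_last_access >= 90:
        return "cold"
    return "hot"
```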


  • Develop cost-management discipline

    Help your team get comfortable thinking about budgets, spending, and tracking costs. Cost optimization happens at different levels of the organization. So it’s important to understand how your workload fits into the bigger picture and supports company goals and FinOps practices. Having visibility into how resources are organized and how financial policies are applied helps you manage your workload in a consistent, efficient way.

    Example scenario

    Contoso organizes and hosts trade shows. They want to improve how they sell tickets and decide to build a mobile app in-house. The following scenarios walk through how they go from idea to launch, with a focus on making smart cost decisions along the way. The web app is written in .NET, hosted on Azure App Service, and uses Azure SQL Database for its database.

    Develop a cost model

    Before you can track spending properly, you must build a basic cost model.

    A cost model gives you a clearer picture of what things might cost, like infrastructure, support, and setup. It also helps you identify what’s driving those costs early on to estimate how changes in usage could affect your budget and revenue over time.
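
    A toy cost model in this spirit might combine a fixed platform cost with a usage-driven component and project it forward as usage grows. All figures are made-up planning inputs, not Azure prices:

```python
# Toy cost model: a fixed platform cost plus a usage-driven component,
# projected forward as usage grows. All figures are made-up planning
# inputs, not Azure prices.
def monthly_cost(users: float, fixed: float = 300.0,
                 cost_per_thousand_users: float = 40.0) -> float:
    return fixed + (users / 1000) * cost_per_thousand_users

def projected_costs(start_users: float, monthly_growth: float,
                    months: int) -> list:
    """Project each month's cost, assuming compounding user growth."""
    users, costs = start_users, []
    for _ in range(months):
        costs.append(round(monthly_cost(users), 2))
        users *= 1 + monthly_growth
    return costs
```

    A model like this makes it easy to ask "what if usage doubles?" and see the budget impact before committing.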

    Contoso’s challenge

    • Contoso wants to build a mobile app to handle ticket sales for their trade shows, but they’re not sure what it’ll cost, especially because demand can spike.
    • They plan to start small and grow, but without a cost model, it’s tough to get funding or plan ahead.

    Applying the approach and outcomes

    • The team maps out different cost scenarios based on the resources they’d need and how usage might grow. They explore a few setups that could handle different traffic levels to get a sense of what their Azure costs might be now and later on.
    • They combine rough estimates for infrastructure, team costs, and expected revenue to build a starting model.
    • This model helps them predict costs over time as usage increases and gives them a tool that they can keep refining as they make more decisions.

    Set a realistic budget

    Make sure your budget covers everything that you have to include, like key features, support, training, and room to grow.

    After you set a budget, you can set spending limits and get alerts if you’re about to go over budget for a specific resource or the whole project.
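
    The alerting idea can be sketched as a threshold check. In practice, budgets and alerts would be configured in Microsoft Cost Management; the threshold values here are examples:

```python
# Sketch of threshold-based budget alerts: report which alert levels the
# current spend has crossed. In practice this comes from Microsoft Cost
# Management; the 50/80/100 percent thresholds are examples.
def fired_alerts(spend: float, budget: float,
                 thresholds=(0.5, 0.8, 1.0)) -> list:
    """Return the percentage thresholds (e.g. 80 for 80%) that spend crossed."""
    ratio = spend / budget
    return [int(t * 100) for t in thresholds if ratio >= t]
```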

    Contoso’s challenge

    • In this scenario, the app is in the design phase and Contoso picked out the basic resources that they need.
    • Contoso needs to figure out their budget for the mobile ticketing workload.
    • Without a solid budget, they risk running out of money, wasting it on things that they don’t need yet, delaying the project timeline, or even putting the entire workload at risk.

    Applying the approach and outcomes

    • As the team refines their cost model, they come up with a confident budget that they can share with stakeholders.
    • This budget gives their architect a clear financial target to design around. The workload team expects to renegotiate the budget a bit as they learn more about the implementation and necessary operations, so they leave a small buffer.
    • The goal is to stay flexible but stick to the budget as closely as possible.

    Encourage upstream communication

    Encourage upstream communication from architects to application owners.

    When your organization makes budget adjustments, real-world learnings from production feedback are just as important as the numbers.

    Contoso’s challenge

    • Contoso’s mobile ticketing app is live and working well.
    • After reviewing how it’s being used, the team realizes that it could be more cost-efficient.
    • Since project management and finance seem happy with the results, they’re unsure if it’s worth bringing up.

    Applying the approach and outcomes

    • The team is encouraged to treat the budget like it’s their own and speak up to product management when they see a better way to meet the app’s needs without sacrificing security, reliability, or performance.
    • The workload team shares their ideas with stakeholders, and they talk about the pros and cons of making changes.
    • The changes are approved, and the savings follow.


  • Safe and responsible AI for the public sector

    In the video, Elizabeth Emanuel, Senior Corporate Counsel for Worldwide Public Sector at Microsoft, explains the importance of meeting legal compliance when public sector organizations adopt generative AI policies. As Elizabeth says: “Public sector organizations have a high bar of legal compliance that must be met to ensure the use of AI reflects the values and priorities required for public trust.”

    Public sector organizations should create and adhere to conscientious AI strategies and integrate these approaches into guiding principles, operational practices, tools, and governance.

    This process might involve:

    • Developing new or adopting existing policies and guidelines
    • Providing training for staff to ensure they’re aware of the considerations associated with AI
    • Ensuring that data used to train AI models is representative and assessed for bias
    • Establishing governance bodies to subject sensitive use cases to particularly high scrutiny, and using tooling and telemetry to ensure that they're functioning as intended and not causing unintended harm
    • Ensuring accountability for both the development and operation of AI capabilities and AI-enabled systems

    One tool that helps public sector organizations implement AI responsibly is the Azure AI Content Safety Studio. Content Safety Studio uses AI to create safer online spaces by classifying harmful content into four categories:

    • Hate
    • Sexual
    • Self-harm
    • Violence

    The AI models detect these types of content and assign a severity score. Based on the severity score, content is surfaced and actions are assigned.
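
    As a hypothetical illustration of severity-based routing (the scale and the cutoffs below are assumptions for illustration, not the service's documented behavior):

```python
# Hypothetical severity-based routing. The 0-7 scale and the cutoffs are
# assumptions for illustration, not the service's documented behavior.
def route_content(severities: dict) -> str:
    """Map per-category severity scores to a moderation action."""
    worst = max(severities.values())
    if worst >= 6:
        return "block"
    if worst >= 4:
        return "human_review"
    if worst >= 2:
        return "flag"
    return "allow"
```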

    Public sector organizations using Content Safety Studio can prioritize what content moderators review with end goals of:

    • Managing and analyzing user-generated content
    • Ensuring compliance with guidelines
    • Maintaining a safe online environment

    Responsible AI should be ingrained as standard practice—not an afterthought—for public sector organizations. By doing so, they can use AI effectively to enhance services and ultimately help society.


  • Accelerate discovery with generative AI

    Generative AI can benefit public sector organizations by helping them quickly understand and simulate complex situations and processes, especially in the following areas:

    • Helping public sector agencies overcome cybersecurity challenges.
    • Performing predictive modeling, forecasting, and simulation.
    • Advancing scientific discoveries.

    Let’s explore how public sector agencies can use generative AI to accelerate the discovery of cyber threats and improve their security posture.

    Improve security posture

    Public sector organizations are often targets for cyberattacks thanks to the amount of sensitive and classified information many of them handle daily. A strong cybersecurity posture is critical to national security. Generative AI can help agencies overcome some of the many security challenges they face.

    In the following video, Sara Nagy, Senior Director of Customer Engagement at Microsoft, explains four ways in which public sector organizations can use generative AI to help improve their security posture.

    https://learn-video.azurefd.net/vod/player?id=2075b5ae-5748-4e23-b5ba-5576ad643481&locale=en-us&embedUrl=%2Ftraining%2Fmodules%2Fenhance-public-sector-services-generative-ai%2Faccelerate-discovery-generative-ai

    When public sector organizations adopt this comprehensive approach to threat management, they can swiftly adapt to evolving security challenges. They also empower security professionals of all expertise levels with valuable insights that can help them perform their roles better.


  • Augment cognition with generative AI

    Generative AI can benefit public sector organizations by increasing comprehension and learning and augmenting cognition, especially in the following areas:

    • Helping fraud investigators find evidence by extracting insights from data
    • Performing multimodal image analysis by gathering insights quickly from open-source intelligence with AI processing
    • Creating knowledge hubs to organize repositories, surface insights, and empower teams to find information more efficiently

    Augmented cognition through AI provides a digital sidekick that helps employees think better and handle complex tasks easily.

    Fraud investigations

    Fraud investigators already use sophisticated tools and lead in AI adoption. However, fraud investigators face considerable challenges because today's models have limitations, like the inability to consider unstructured, non-numerical data or a lack of adaptability due to the need for frequent manual updates.

    In the following video, Sara Nagy, Senior Director of Customer Engagement at Microsoft, explains how generative AI offers several unique capabilities that complement traditional solutions for public sector fraud investigators.

    https://learn-video.azurefd.net/vod/player?id=14861f33-76ee-4b11-83e5-0e3c192aa4d7&locale=en-us&embedUrl=%2Ftraining%2Fmodules%2Fenhance-public-sector-services-generative-ai%2Faugment-cognition-generative-ai

    Multimodal image analysis

    Generative AI can help public sector organizations unlock value through analysis of images like security camera footage, optical sensors, handheld devices, and other image-capturing technology.

    Understanding complex visual data can be difficult due to the following potential situations:

    • Images from a single modality might not provide sufficient information for accurate analysis
    • Images acquired under different conditions (for example, lighting, resolution, or viewpoint) can introduce variability
    • A single imaging modality might not be able to capture certain features or aspects of interest
    • Images in different formats or from different modalities can require unique and costly processing methods

    To explore the tools needed to support this scenario, see Multimodal image analysis with Azure OpenAI Service (PDF) and the reference architecture Multimodal image analysis (PDF).

    Create knowledge hubs

    Public sector employees need to find relevant information easily when they access internal resources. When they can't find the information they're looking for, it can be frustrating. That frustration in turn places an added burden on internal staff, like HR, to respond to requests that could be self-serviced.

    Other challenges employees experience might include:

    • Inconsistent document management and organization practices
    • Lack of integration between different information repositories or the use of legacy technologies
    • Limited training for employees on available resources and processes
    • Discrepancies in document formats, naming conventions, and indexing

    Generative AI can help organize knowledge hubs by letting users search for information in natural language and then providing quick responses and access to data. This capability helps increase satisfaction in the workplace and frees up employees' time with a repository of easily accessible information.


  • Enhance productivity with generative AI

    One of the ways that AI can make a significant impact in the public sector is by enhancing productivity. When public sector organizations use AI, they can empower the workforce and open up new avenues of efficiency and creativity in areas like:

    • Simplifying case management by helping case workers manage heavy workloads with AI-generated drafts, insights, and automation
    • Simplifying rule making and accelerating the process by using AI to help draft and refine policies, gather public comments, and then analyze the comments
    • Empowering constituents to engage with public sector organizations more effectively, which results in shorter wait times, faster resolution, and informed support
    • Helping IT teams migrate legacy apps by documenting their functionalities, translating legacy code, and creating translations of app functions

    Let’s explore each one of these use cases.

    Simplify case management

    In the following video, Sara Nagy, Senior Director of Customer Engagement at Microsoft, explains how generative AI can enhance productivity for case workers by managing heavy workloads with AI-generated drafts, insights, and automation.

    https://learn-video.azurefd.net/vod/player?id=edaa553a-236f-4e42-ba8a-a456d7daf0a1&locale=en-us&embedUrl=%2Ftraining%2Fmodules%2Fenhance-public-sector-services-generative-ai%2Fenhance-productivity-generative-ai

    Simplify rulemaking

    Rulemaking is a long, complex, and costly process that includes several challenges. For example:

    • Writing and understanding the style and language of regulatory text can be difficult
    • Public participation can be a laborious process with many stakeholders
    • Rulemaking can involve contentious topics with conflicting interests and the possibility of litigation
    • Public employees need to maintain inclusive, fair, and equal access across both marginalized and powerful groups

    AI can help enhance employee productivity around rulemaking in the drafting phase. Teams can use an AI assistant to create initial drafts with natural language prompts and feedback. When the draft undergoes refinement, which involves many internal stakeholders like subject matter experts and legal counsel, the AI assistant can help flag potential errors. Then, during the public comment review stage, different stakeholders can engage with an AI assistant to understand and provide feedback on the proposed rule. Lastly, after gathering public comments, public affairs staff can use generative AI in the comment analysis phase to summarize the comments and find trends, concerns, and insights to inform their decision making.

    Empower constituents

    Generative AI can help constituents engage with public sector organizations more effectively by:

    • Decreasing call center wait times and hang-up rates
    • Offering constituents quicker options to connect through chat or email support
    • Making constituent data and information more accessible for agency support agents
    • Regaining trust with constituents who might already associate public sector organizations with poor support experiences

    AI-powered conversational assistants use natural language and contextual understanding to comprehend and interpret user inputs with high accuracy.

    The assistant also uses semantic search to match concepts and meaning when the user asks a question. The conversational continuity of the assistant supports a longer dialog and helps the user feel more at ease. More importantly, the AI-powered assistant frees up human employees to focus on more complex tasks.

    Migrate legacy applications

    Many public sector organizations run legacy applications, which, in today's technology environment, can cause problems for an organization and its employees. Problems with managing legacy applications might include:

    • Shortage of skilled programmers
    • Dependence on outdated hardware
    • Scalability challenges
    • Knowledge transfer
    • Compatibility issues
    • Limited vendor support
    • Security concerns
    • Regulatory compliance

    Generative AI can help public sector organizations modernize legacy applications and outdated code, and it can help migrate legacy algorithms and functions in the following areas.

    Grounding

    Developers submit examples of the current code to the generative AI model. The AI model learns the pattern and syntax.

    Assessment

    Generative AI reviews the code and helps document its functionalities to determine which part of the code can be migrated.

    Initial drafting

    Developers make initial translations of the legacy algorithms and functions and assess the baseline performance. If developers see flaws, they might choose other sample code and go back to step one and ground the model on the new material.

    Development and iteration

    The development team uses AI to create a first draft, which the development team reviews and edits. Generative AI might help refactor the new code to improve quality and efficiency.

    Once the code is complete and thoroughly tested, the legacy application can be retired, and the new cloud-based application can go live.


  • AI in the public sector

    Generative AI has the ability to revolutionize the missions of public sector organizations and transform many aspects of our everyday work and life.

    Generative AI holds a lot of potential for helping public sector organizations accomplish their missions. But it’s important to understand what generative AI can and can’t do so you can identify appropriate areas and ways where the technology can have the highest impact. The four main capabilities of generative AI are:

    • Content creation: Creating a human-like output, including textual, visual, or multimedia content based on input data or natural language prompts
    • Summarization: Extracting key themes and insights from a longer piece of text, including answering natural language queries
    • Code production: Generating code based on a prompt, translating code from one programming language to another, or reviewing and improving existing code
    • Semantic search: Going beyond traditional keyword matching by understanding the meaning behind a query and retrieving relevant search results that are semantically related to the user’s intent
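
    Semantic search can be illustrated with a toy ranking by cosine similarity of embedding vectors; the three-dimensional "embeddings" here are hand-made stand-ins for a real embedding model's output:

```python
import math

# Toy illustration of semantic search: rank documents by cosine
# similarity of embedding vectors rather than keyword overlap. The
# three-dimensional "embeddings" stand in for a real embedding model's
# output.
def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.hypot(*a) * math.hypot(*b))

def semantic_search(query_vec, doc_vecs: dict) -> str:
    """Return the id of the document whose embedding is closest to the query."""
    return max(doc_vecs, key=lambda d: cosine(query_vec, doc_vecs[d]))
```

    Because ranking happens in embedding space, a query phrased differently from the document text can still retrieve the right document, which is what distinguishes this from keyword matching.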

    In public sector organizations, there are three key areas where generative AI can drive transformation:

    • Enhanced productivity
    • Augmented cognition
    • Accelerated discovery

    Enhanced productivity

    Enhanced productivity through AI can enrich employee experiences by helping to unlock the best of the workforce's talent. From content summarization to automating routine tasks, AI can positively affect productivity by reducing tedious work.

    Augmented cognition

    Another way AI can help an organization is through augmented cognition. Augmented cognition provides teams with a digital sidekick that helps them think better and handle complex tasks with technology's help. Data collection and extraction, intel analysis and interpretation, and anomaly detection are some of the ways AI can help increase a team's productivity.

    Accelerated discovery

    Lastly, AI can empower every organization to bend the curve on innovation by modernizing internal processes, accelerating the discovery process, and understanding and simulating complex situations and processes.

    This module explores each of these areas by sharing the following use cases and highlighting technologies and tools to support this revolution.

    • Simplifying case management
    • Simplifying rulemaking
    • Empowering constituents
    • Migrating legacy apps
    • Assisting fraud investigations
    • Employing multimodal image analysis
    • Creating knowledge hubs
    • Improving security posture


  • Create custom AI models with Azure Machine Learning

    The availability of sophisticated AI models can help organizations significantly reduce the amount of resources that a data science project requires. Let's see how organizations can tackle machine learning challenges and operations with Azure Machine Learning.

    Machine learning challenges and machine learning operations

    Maintaining AI solutions typically requires machine learning lifecycle management to document and manage data, code, model environments, and the machine learning models themselves. You need to establish processes for developing, packaging, and deploying models, as well as monitoring their performance and occasionally retraining them. And most organizations are managing multiple models in production at the same time, adding to the complexity.

    To cope effectively with this complexity, some best practices are required. They focus on cross-team collaboration, automating and standardizing processes, and ensuring that models can be easily audited, explained, and reused. To get this done, data science teams rely on the machine learning operations (MLOps) approach. This methodology is inspired by DevOps (development and operations), the industry standard for managing operations for an application development cycle, because the struggles of developers and data scientists are similar.

    Azure Machine Learning

    Data scientists can manage and execute machine learning DevOps from Azure Machine Learning, a platform by Microsoft to make machine learning lifecycle management and operations practices easier. Such tools help teams collaborate in a shared, auditable, and safe environment where many processes can be optimized via automation.

    Screenshot of the Azure Machine Learning studio landing page.

    Machine learning lifecycle management

    Azure Machine Learning supports end-to-end machine learning lifecycle management of pretrained and custom models. The typical lifecycle includes the following steps: data preparation, model training, model packaging, model validation, model deployment, model monitoring and retraining.

    Diagram showing the ML lifecycle: prepare data, train model, package model, validate model, deploy model, monitor model, and retrain model.

    The classic approach covers all the usual steps of a data science project.

    1. Prepare dataset. AI starts with data. First, data scientists need to prepare data with which to train the model. Data preparation is often the biggest time commitment in the lifecycle. This task involves finding or building your own dataset and cleaning it so it’s easily readable by machines. You want to make sure the data is a representative sample, that your variables are pertinent for your goal, and so on.
    2. Train and test. Next, data scientists apply algorithms to the data to train a machine learning model. Then they test it with new data to see how accurate its predictions are.
    3. Package. A model can't be put directly into an app. It needs to be containerized so it can run with all the tools and frameworks it's built on.
    4. Validate. At this point, the team evaluates how the model's performance compares to their business goals. Testing may return good-enough metrics, but the model may still not work as expected when used in a real business scenario.
      • Repeat steps 1-4. It can take hundreds of training hours to find a satisfactory model. The development team may train many versions of the model by adjusting training data, tuning algorithm hyperparameters, or trying different algorithms. Ideally the model improves with each round of adjustment. Ultimately, it’s the development team’s role to determine which version of the model best fits the business use case.
    5. Deploy. Finally, they deploy the model. Options for deployment include: in the cloud, on an on-premises server, and on devices like cameras, IoT gateways, or machinery.
    6. Monitor and retrain. Even if a model works well at first, it needs to be continually monitored and retrained to stay relevant and accurate.
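    The numbered steps above can be sketched in miniature with plain Python. The example below is a hedged illustration only: the dataset, the single-threshold "model," and the train/test split are all invented for the sketch, and a real project would use Azure Machine Learning or a machine learning framework rather than this toy loop.

    ```python
    # Miniature ML lifecycle: prepare data, train, then validate on held-out data.
    # The dataset and threshold "model" are invented purely for illustration.
    import random

    random.seed(0)

    # 1. Prepare dataset: points above 5.0 are labeled 1, below are 0,
    #    with ~10% label noise to make validation meaningful.
    xs = [random.uniform(0, 10) for _ in range(200)]
    data = [(x, int(x > 5.0) if random.random() > 0.1 else int(x <= 5.0))
            for x in xs]
    train, test = data[:150], data[150:]  # simple train/test split

    def train_model(samples):
        """2. Train: scan candidate thresholds, keep the best separator."""
        best_t, best_acc = 0.0, 0.0
        for t in [i / 10 for i in range(0, 101)]:
            acc = sum(int(x > t) == y for x, y in samples) / len(samples)
            if acc > best_acc:
                best_t, best_acc = t, acc
        return best_t

    def evaluate(threshold, samples):
        """4. Validate: measure accuracy against data the model never saw."""
        return sum(int(x > threshold) == y for x, y in samples) / len(samples)

    model = train_model(train)
    accuracy = evaluate(model, test)
    print(f"threshold={model:.1f} accuracy={accuracy:.2f}")
    ```

    Packaging (step 3) would then wrap this model and its dependencies in a container, and monitoring (step 6) would repeat the `evaluate` call on fresh production data over time, retraining when accuracy drifts.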


  • Develop AI solutions with Azure AI Services

    This unit discusses the prebuilt AI models that are available in Azure AI Services. They are a solid alternative to developing internal custom AI models.

    What is Azure AI Services?

    When considering adopting AI into your business, look at prebuilt AI services first. Azure AI Services is a Microsoft product that delivers AI as software as a service (SaaS). It includes pretrained models developed by Microsoft researchers and data scientists around the globe to solve common problems. To avoid reinventing the wheel, businesses can use these prebuilt services to achieve quality and accelerate delivery of technology solutions.

    It’s better to use the Azure AI Services that offer prebuilt AI services in vision, speech, language, search, or generative AI to solve common scenarios. This brings AI within reach of every developer and organization without requiring machine learning expertise. As a result, it enables developers of all skill levels to easily add intelligence to new or existing business applications.

    Using Azure AI Services can:

    • Save costs: Because AI Services is serverless, it's usually less costly than developing and training custom models from scratch internally.
    • Give deployment flexibility: You can export AI Services models and run them wherever you need, in the cloud, on-premises, or on the edge.
    • Provide enterprise-level security: AI services provide a layered security model, including authentication with Microsoft Entra credentials, a valid resource key, and Azure Virtual Networks.
    • Connect to an ecosystem of products: AI services are part of a broad ecosystem that includes automation and integration tools, deployment options, Docker containers for secure access, and tools for big data scenarios.
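    As a hedged sketch of what calling a prebuilt service looks like, the snippet below builds (but doesn't send) a request body for the Azure AI Language sentiment analysis REST endpoint. The endpoint URL and api-version here are illustrative assumptions; check the current Azure AI Language REST reference before relying on them.

    ```python
    # Sketch of the JSON payload for Azure AI Language sentiment analysis.
    # Endpoint and api-version are illustrative assumptions, not verified values.
    import json

    endpoint = "https://<your-resource>.cognitiveservices.azure.com"  # placeholder
    url = f"{endpoint}/language/:analyze-text?api-version=2023-04-01"  # assumed

    payload = {
        "kind": "SentimentAnalysis",
        "analysisInput": {
            "documents": [
                {"id": "1", "language": "en",
                 "text": "The new registration portal is fast and easy to use."}
            ]
        },
    }
    body = json.dumps(payload)
    # A real call would POST `body` to `url` with these headers:
    #   Ocp-Apim-Subscription-Key: <your resource key>
    #   Content-Type: application/json
    print(body[:60])
    ```

    The response would contain a sentiment label (positive, neutral, or negative) and confidence scores per document, with no model training on your side.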

    Azure AI Services capabilities

    Azure AI capabilities include: vision, language, speech, document intelligence, search, and generative AI. You can build solutions with these capabilities using a suite of Azure AI services, including:

    • Azure AI Vision: includes models that analyze images and videos. Besides more general-purpose models, there are specialized ones for extracting text from images (optical character recognition, or OCR) and for recognizing human faces. Another option is Azure Custom Vision, which lets users build their own AI models to recognize objects or classify images. Keep in mind that face recognition services are highly restricted under Microsoft's responsible AI policies.
    • Azure AI Language: focuses on processing and analyzing text. Its models are trained to understand natural language and extract insights. For example, they recognize language, intent, entities, and sentiment in a text. They can also find answers to questions put to them.
    • Azure AI Speech: provides models that deal with oral conversation. They can transform speech to text and vice-versa. It’s also possible to translate what the speaker says and identify each speaker. Models can even suggest pronunciation corrections to the speakers.
    • Azure AI Document Intelligence: incorporates OCR and text analytics models to extract data from invoices, receipts, and other documents. Document intelligence relies on machine learning models that are trained to recognize data in text.
    • Azure AI Search: provides secure information retrieval at scale over user-owned content in traditional and generative AI search applications. Azure AI Search can index unstructured, typed, image-based, or hand-written media. The indexes can be used for internal only use, or to enable searchable content on public-facing internet assets.
    • Azure OpenAI Service: enables users to access generative AI models through Azure AI Services. In other words, it lets you use OpenAI models directly from Azure rather than through the public OpenAI API. Keep in mind that Azure OpenAI Service isn't the only Microsoft product that delivers models of this kind to users. In previous units, we already discussed the generative AI included in Microsoft Copilot for Microsoft 365 and Copilot in Power Platform. These copilot features are powered by GPT, an OpenAI model for text generation.
    • Azure AI Foundry: a Microsoft cloud platform that brings together multiple Azure AI-related services into a single, unified development environment. Developers can use these services to build end-to-end AI solutions. Specifically, Azure AI Foundry combines:
      • The model catalog and prompt flow development capabilities of Azure Machine Learning service.
      • The generative AI model deployment, testing, and custom data integration capabilities of Azure OpenAI service.
      • Integration with Azure AI Services for speech, vision, language, document intelligence, and content safety.
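    To make the Azure OpenAI Service bullet concrete, here's a hedged sketch of the shape of a chat completion request. Nothing is sent over the network; the endpoint, deployment name, and api-version are illustrative assumptions, and a real call would also need an api-key header or a Microsoft Entra token.

    ```python
    # Sketch of an Azure OpenAI chat completions request (constructed, not sent).
    # Endpoint, deployment name, and api-version are illustrative assumptions.
    import json

    endpoint = "https://<your-resource>.openai.azure.com"  # placeholder
    deployment = "my-gpt-deployment"  # assumed deployment name
    url = (f"{endpoint}/openai/deployments/{deployment}"
           f"/chat/completions?api-version=2024-02-01")  # assumed version

    request_body = {
        "messages": [
            {"role": "system", "content": "You are a helpful course assistant."},
            {"role": "user", "content": "Summarize today's enrollment changes."},
        ],
        "max_tokens": 256,
        "temperature": 0.2,
    }
    body = json.dumps(request_body)
    # Authentication would use an api-key header or a Microsoft Entra token.
    print(len(body))
    ```

    The key difference from the public OpenAI API is that the model is addressed by the name of your Azure deployment, inside your Azure resource, with Azure's security and networking controls applied.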
