Commit bc7b3ae

Integrated latest changes at 11-20-2025 10:30:04 AM
1 parent 75bc151 commit bc7b3ae

File tree

27 files changed: +1447 −37 lines

ej2-javascript/ai-assistview/speech/es5-speech-to-text.md

Lines changed: 20 additions & 1 deletion

@@ -25,7 +25,18 @@ Before integrating `Speech-to-Text`, ensure the following:

## Configure Speech-to-Text

-To enable Speech-to-Text functionality, modify the `index.js` file to incorporate the Web Speech API. The [SpeechToText](https://ej2.syncfusion.com/javascript/documentation/speech-to-text/es5-getting-started) control listens for microphone input, transcribes spoken words, and updates the AI AssistView's editable footer with the transcribed text. The transcribed text is then sent as a prompt to the Azure OpenAI service via the AI AssistView control.
+To enable Speech-to-Text functionality in the JavaScript AI AssistView control, update the `index.js` file to incorporate the Web Speech API.
+
+The [SpeechToText](https://ej2.syncfusion.com/javascript/documentation/speech-to-text/es5-getting-started) control listens to audio input from the device's microphone, transcribes spoken words into text, and updates the AI AssistView's editable footer using the [footerTemplate](https://ej2.syncfusion.com/javascript/documentation/api/ai-assistview/#footertemplate) property to display the transcribed text. The transcribed text is then sent as a prompt to the Azure OpenAI service via the AI AssistView control.
+
+### Configuration Options
+
+* **[`lang`](https://ej2.syncfusion.com/javascript/documentation/api/speech-to-text/#lang)**: Specifies the language for speech recognition. For example:
+  * `en-US` for American English
+  * `fr-FR` for French
+
+* **[`allowInterimResults`](https://ej2.syncfusion.com/javascript/documentation/api/speech-to-text/#allowinterimresults)**: Set to `true` to receive real-time (interim) recognition results, or `false` to receive only final results.

{% tabs %}
{% highlight js tabtitle="index.js" %}

@@ -38,6 +49,14 @@ To enable Speech-to-Text functionality, modify the `index.js` file to incorporat

{% previewsample "page.domainurl/code-snippet/ai-assistview/speech/stt" %}

+## Error Handling
+
+The `SpeechToText` control provides events to handle errors that may occur during speech recognition. For more information, refer to the [Error Handling](https://ej2.syncfusion.com/javascript/documentation/speech-to-text/speech-recognition#error-handling) section in the documentation.
+
+## Browser Compatibility
+
+The `SpeechToText` control relies on the [Speech Recognition API](https://developer.mozilla.org/en-US/docs/Web/API/SpeechRecognition), which has limited browser support. Refer to the [Browser Compatibility](https://ej2.syncfusion.com/javascript/documentation/speech-to-text/speech-recognition#browser-support) section for detailed information.
+
## See Also

* [Text-to-Speech](./es5-text-to-speech.md)
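The configuration options above can be sketched in ES5 JavaScript as follows. This is a minimal illustration, assuming the `ej.inputs.SpeechToText` global and the `#speechToText` button element used elsewhere in this commit's samples; `lang` and `allowInterimResults` are the documented properties linked above:

```javascript
// Sketch: Speech-to-Text configured for French with real-time interim results.
var speechToText = new ej.inputs.SpeechToText({
    lang: 'fr-FR',               // recognize French speech
    allowInterimResults: true,   // stream interim results as the user speaks
    transcriptChanged: function (args) {
        // args.transcript holds the current transcription
        console.log(args.transcript);
    }
});
speechToText.appendTo('#speechToText');
```

Requires a browser with Speech Recognition support and microphone permission; it will not run in a plain Node environment.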

ej2-javascript/ai-assistview/speech/speech-to-text.md

Lines changed: 20 additions & 1 deletion

@@ -25,7 +25,18 @@ Before integrating `Speech-to-Text`, ensure the following:

## Configure Speech-to-Text

-To enable Speech-to-Text functionality, modify the `index.ts` file to incorporate the Web Speech API. The [SpeechToText](https://ej2.syncfusion.com/documentation/speech-to-text/getting-started) control listens for microphone input, transcribes spoken words, and updates the AI AssistView's editable footer with the transcribed text. The transcribed text is then sent as a prompt to the Azure OpenAI service via the AI AssistView control.
+To enable Speech-to-Text functionality in the TypeScript AI AssistView control, update the `index.ts` file to incorporate the Web Speech API.
+
+The [SpeechToText](https://ej2.syncfusion.com/documentation/speech-to-text/getting-started) control listens to audio input from the device's microphone, transcribes spoken words into text, and updates the AI AssistView's editable footer using the [footerTemplate](https://ej2.syncfusion.com/documentation/api/ai-assistview/#footertemplate) property to display the transcribed text. The transcribed text is then sent as a prompt to the Azure OpenAI service via the AI AssistView control.
+
+### Configuration Options
+
+* **[`lang`](https://ej2.syncfusion.com/documentation/api/speech-to-text/#lang)**: Specifies the language for speech recognition. For example:
+  * `en-US` for American English
+  * `fr-FR` for French
+
+* **[`allowInterimResults`](https://ej2.syncfusion.com/documentation/api/speech-to-text/#allowinterimresults)**: Set to `true` to receive real-time (interim) recognition results, or `false` to receive only final results.

{% tabs %}
{% highlight ts tabtitle="index.ts" %}

@@ -38,6 +49,14 @@ To enable Speech-to-Text functionality, modify the `index.ts` file to incorporat

{% previewsample "page.domainurl/code-snippet/ai-assistview/speech/stt" %}

+## Error Handling
+
+The `SpeechToText` control provides events to handle errors that may occur during speech recognition. For more information, refer to the [Error Handling](https://ej2.syncfusion.com/documentation/speech-to-text/speech-recognition#error-handling) section in the documentation.
+
+## Browser Compatibility
+
+The `SpeechToText` control relies on the [Speech Recognition API](https://developer.mozilla.org/en-US/docs/Web/API/SpeechRecognition), which has limited browser support. Refer to the [Browser Compatibility](https://ej2.syncfusion.com/documentation/speech-to-text/speech-recognition#browser-support) section for detailed information.
+
## See Also

* [Text-to-Speech](./text-to-speech.md)
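The same configuration options can be sketched in TypeScript. This is a minimal illustration, assuming the `@syncfusion/ej2-inputs` package and a `#speechToText` button element as used in this commit's samples; `lang` and `allowInterimResults` are the documented properties linked above:

```typescript
import { SpeechToText, TranscriptChangedEventArgs } from '@syncfusion/ej2-inputs';

// Sketch: Speech-to-Text configured for French with real-time interim results.
const speechToText: SpeechToText = new SpeechToText({
    lang: 'fr-FR',               // recognize French speech
    allowInterimResults: true,   // stream interim results as the user speaks
    transcriptChanged: (args: TranscriptChangedEventArgs) => {
        // args.transcript holds the current transcription
        console.log(args.transcript);
    }
});
speechToText.appendTo('#speechToText');
```

Requires a browser with Speech Recognition support and microphone permission; it will not run in a plain Node environment.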
Lines changed: 14 additions & 0 deletions (new file)

#container {
    visibility: hidden;
    margin: 20px auto;
    width: 350px;
}

#loader {
    color: #008cff;
    height: 40px;
    left: 45%;
    position: absolute;
    top: 45%;
    width: 30%;
}
Lines changed: 98 additions & 0 deletions (new file)

let currentUserModel = {
    id: "user1",
    user: "Albert"
};

let michaleUserModel = {
    id: "user2",
    user: "Michale Suyama"
};

let chatMessages = [
    {
        author: currentUserModel,
        text: "Hi Michale, are we on track for the deadline?"
    },
    {
        author: michaleUserModel,
        text: "Yes, the design phase is complete."
    },
    {
        author: currentUserModel,
        text: "I'll review it and send feedback by today."
    }
];

// Initialize the Chat UI control
let chatUI = new ej.interactivechat.ChatUI({
    messages: chatMessages,
    user: currentUserModel,
    footerTemplate: "#footerContent"
});

// Render the initialized Chat UI
chatUI.appendTo('#chatui');

// Initialize the Speech-to-Text control
let speechToTextObj = new ej.inputs.SpeechToText({
    transcriptChanged: onTranscriptChange,
    onStop: onListeningStop,
    created: onCreated,
    cssClass: 'e-flat'
});
speechToTextObj.appendTo('#speechToText');

// Updates the input area as Speech-to-Text transcribes speech
function onTranscriptChange(args) {
    document.querySelector('#chatui-footer').innerText = args.transcript;
}

// Handles actions when speech listening stops
function onListeningStop() {
    toggleButtons();
}

// Wires up event handlers after the control is created
function onCreated() {
    var chatuiFooter = document.querySelector('#chatui-footer');
    var sendButton = document.querySelector('#chatui-sendButton');

    sendButton.addEventListener('click', sendIconClicked);
    chatuiFooter.addEventListener('input', toggleButtons);

    chatuiFooter.addEventListener('keydown', function (e) {
        if (e.key === 'Enter' && !e.shiftKey) {
            sendIconClicked();
            e.preventDefault(); // Prevent the default behavior of the Enter key
        }
    });
    toggleButtons();
}

// Toggles the visibility of the send and Speech-to-Text buttons
function toggleButtons() {
    var chatuiFooter = document.querySelector('#chatui-footer');
    var sendButton = document.querySelector('#chatui-sendButton');
    var speechButton = document.querySelector('#speechToText');

    var hasText = chatuiFooter.innerText.trim() !== '';
    sendButton.classList.toggle('visible', hasText);
    speechButton.classList.toggle('visible', !hasText);

    if (!hasText && (chatuiFooter.innerHTML === '<br>' || !chatuiFooter.innerHTML.trim())) {
        chatuiFooter.innerHTML = '';
    }
}

// Handles the send button click event
function sendIconClicked() {
    var editor = document.querySelector('#chatui-footer');
    const messageContent = editor?.innerText || '';
    if (messageContent.trim()) {
        chatUI.addMessage({
            author: currentUserModel,
            text: messageContent
        });
        editor.innerText = '';
        toggleButtons(); // Update button visibility
    }
}
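The keydown and toggle logic in the sample above hinge on two small decisions: send on Enter unless Shift is held, and show the send button only when the footer contains non-whitespace text. Extracted as pure helpers (hypothetical names, not part of the sample), those rules are easy to unit-test in isolation:

```javascript
// Hypothetical pure helpers mirroring the sample's keydown and toggle logic.

// Send when Enter is pressed without Shift (Shift+Enter inserts a newline instead).
function shouldSend(key, shiftKey) {
    return key === 'Enter' && !shiftKey;
}

// The send button is visible only when the editor holds non-whitespace text.
function hasMessageText(innerText) {
    return innerText.trim() !== '';
}

console.log(shouldSend('Enter', false)); // true: plain Enter sends
console.log(shouldSend('Enter', true));  // false: Shift+Enter adds a newline
console.log(hasMessageText('   '));      // false: whitespace-only footer
```

Keeping DOM reads out of such predicates makes the event handlers thin wrappers that are simpler to reason about.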
Lines changed: 99 additions & 0 deletions (new file)

import { ChatUI, UserModel, MessageModel } from '@syncfusion/ej2-interactive-chat';
import { SpeechToText, TranscriptChangedEventArgs } from '@syncfusion/ej2-inputs';

let currentUserModel: UserModel = {
    id: "user1",
    user: "Albert"
};

let michaleUserModel: UserModel = {
    id: "user2",
    user: "Michale Suyama"
};

let chatMessages: MessageModel[] = [
    {
        author: currentUserModel,
        text: "Hi Michale, are we on track for the deadline?"
    },
    {
        author: michaleUserModel,
        text: "Yes, the design phase is complete."
    },
    {
        author: currentUserModel,
        text: "I'll review it and send feedback by today."
    }
];

// Initialize the Chat UI control
let chatUI: ChatUI = new ChatUI({
    messages: chatMessages,
    user: currentUserModel,
    footerTemplate: "#footerContent"
});

// Render the initialized Chat UI
chatUI.appendTo('#chatui');

// Initialize the Speech-to-Text control
let speechToTextObj: SpeechToText = new SpeechToText({
    transcriptChanged: onTranscriptChange,
    onStop: onListeningStop,
    created: onCreated,
    cssClass: 'e-flat'
});
speechToTextObj.appendTo('#speechToText');

// Handles actions when speech listening stops
function onListeningStop(): void {
    toggleButtons();
}

// Updates the input area as Speech-to-Text transcribes speech
function onTranscriptChange(args: TranscriptChangedEventArgs): void {
    (document.querySelector('#chatui-footer') as HTMLElement).innerText = args.transcript;
}

// Wires up event handlers after the control is created
function onCreated(): void {
    let chatuiFooter = document.querySelector('#chatui-footer') as HTMLElement;
    let sendButton = document.querySelector('#chatui-sendButton') as HTMLElement;

    sendButton.addEventListener('click', sendIconClicked);
    chatuiFooter.addEventListener('input', toggleButtons);

    chatuiFooter.addEventListener('keydown', function (e) {
        if (e.key === 'Enter' && !e.shiftKey) {
            sendIconClicked();
            e.preventDefault(); // Prevent the default behavior of the Enter key
        }
    });
    toggleButtons();
}

// Toggles the visibility of the send and Speech-to-Text buttons
function toggleButtons(): void {
    let chatuiFooter = document.querySelector('#chatui-footer') as HTMLElement;
    let sendButton = document.querySelector('#chatui-sendButton') as HTMLElement;
    let speechButton = document.querySelector('#speechToText') as HTMLElement;

    let hasText = chatuiFooter.innerText.trim() !== '';
    sendButton.classList.toggle('visible', hasText);
    speechButton.classList.toggle('visible', !hasText);

    if (!hasText && (chatuiFooter.innerHTML === '<br>' || !chatuiFooter.innerHTML.trim())) {
        chatuiFooter.innerHTML = '';
    }
}

// Handles the send button click event
function sendIconClicked(): void {
    const editor = document.querySelector('#chatui-footer') as HTMLElement;
    const messageContent = editor?.innerText || '';
    if (messageContent.trim()) {
        chatUI.addMessage({
            author: currentUserModel,
            text: messageContent
        });
        editor.innerText = '';
        toggleButtons(); // Update button visibility
    }
}
Lines changed: 98 additions & 0 deletions (new file)

<!DOCTYPE html>
<html lang="en">

<head>
    <title>EJ2 Chat UI</title>
    <meta charset="utf-8" />
    <meta name="viewport" content="width=device-width, initial-scale=1.0" />
    <meta name="description" content="TypeScript Chat UI Control" />
    <meta name="author" content="Syncfusion" />
    <link href="index.css" rel="stylesheet" />
    <link href="https://cdn.syncfusion.com/ej2/20.3.56/ej2-base/styles/material.css" rel="stylesheet" />
    <link href="https://cdn.syncfusion.com/ej2/20.3.56/ej2-interactive-chat/styles/material.css" rel="stylesheet" />
    <link href="https://cdn.syncfusion.com/ej2/20.3.56/ej2-inputs/styles/material.css" rel="stylesheet" />
    <link href="https://cdn.syncfusion.com/ej2/20.3.56/ej2-buttons/styles/material.css" rel="stylesheet" />
    <link href="https://cdn.syncfusion.com/ej2/20.3.56/ej2-navigations/styles/material.css" rel="stylesheet" />
    <link href="https://cdn.syncfusion.com/ej2/20.3.56/ej2-notifications/styles/material.css" rel="stylesheet" />
    <script src="https://cdnjs.cloudflare.com/ajax/libs/systemjs/0.19.38/system.js"></script>
    <script src="https://cdn.jsdelivr.net/npm/marked/marked.min.js"></script>
    <script src="https://cdn.syncfusion.com/ej2/20.4.38/dist/ej2.min.js" type="text/javascript"></script>
    <script src="systemjs.config.js"></script>
    <style>
        .integration-speechtotext {
            height: 400px;
            width: 450px;
            margin: 0 auto;
        }

        .integration-speechtotext #chatui-sendButton {
            width: 40px;
            height: 40px;
            font-size: 15px;
            border: none;
            background: none;
            cursor: pointer;
        }

        .integration-speechtotext #speechToText.visible,
        .integration-speechtotext #chatui-sendButton.visible {
            display: inline-block;
        }

        .integration-speechtotext #speechToText,
        .integration-speechtotext #chatui-sendButton {
            display: none;
        }

        @media only screen and (max-width: 750px) {
            .integration-speechtotext {
                width: 100%;
            }
        }

        .integration-speechtotext .e-footer-wrapper {
            display: flex;
            border: 1px solid #c1c1c1;
            margin: 5px 5px 0 5px;
            border-radius: 10px;
            padding: 5px;
        }

        .integration-speechtotext .content-editor {
            width: 100%;
            overflow-y: auto;
            font-size: 14px;
            min-height: 20px;
            max-height: 150px;
            padding: 10px;
        }

        .integration-speechtotext .content-editor[contentEditable='true']:empty:before {
            content: attr(placeholder);
            color: #6b7280;
            font-style: italic;
        }

        .integration-speechtotext .option-container {
            align-self: flex-end;
        }
    </style>
</head>

<body>
    <div id='loader'>Loading....</div>
    <div class="integration-speechtotext">
        <div id="chatui"></div>
    </div>
    <script id="footerContent" type="text/x-jsrender">
        <div class="e-footer-wrapper">
            <div id="chatui-footer" class="content-editor" contenteditable="true" placeholder="Click to speak or start typing..."></div>
            <div class="option-container">
                <button id="speechToText"></button>
                <button id="chatui-sendButton" class="e-assist-send e-icons" role="button"></button>
            </div>
        </div>
    </script>
</body>

</html>
