problem
stringlengths
26
131k
labels
class label
2 classes
static int kvm_put_msr_feature_control(X86CPU *cpu) { struct { struct kvm_msrs info; struct kvm_msr_entry entry; } msr_data; kvm_msr_entry_set(&msr_data.entry, MSR_IA32_FEATURE_CONTROL, cpu->env.msr_ia32_feature_control); msr_data.info.nmsrs = 1; return kvm_vcpu_ioctl(CPU(cpu), KVM_SET_MSRS, &msr_data); }
1threat
Chrome extension Like buffer . com web site : How can I achieve this functionality in my web application ? [go to this location for better understanding][1] [and here is the image showing the extension adding pop up.][2] [1]: https://i.stack.imgur.com/c5QjZ.png [2]: https://i.stack.imgur.com/hND9y.png Everywhere in stackoverflow and google it is telling that we cannot add extension pop up like : https://stackoverflow.com/questions/17928979/how-to-programmatically-open-chrome-extension-popup-html Please help me to know how can we display this alert "add extension" .
0debug
I am getting some erron in most recent call last : Traceback (most recent call last): File "C:/Users/HP/PycharmProjects/HelloWorld/app.py", line 24, in <module> wb.save('transactions2.xlsx') File "C:\Users\HP\PycharmProjects\HelloWorld\venv\lib\site-packages\openpyxl\workbook\workbook.py", line 397, in save save_workbook(self, filename) File "C:\Users\HP\PycharmProjects\HelloWorld\venv\lib\site-packages\openpyxl\writer\excel.py", line 292, in save_workbook archive = ZipFile(filename, 'w', ZIP_DEFLATED, allowZip64=True) File "C:\Users\HP\AppData\Local\Programs\Python\Python37-32\lib\zipfile.py", line 1204, in __init__ self.fp = io.open(file, filemode) PermissionError: [Errno 13] Permission denied: 'transactions2.xlsx'
0debug
static void gen_neon_zip_u16(TCGv t0, TCGv t1) { TCGv tmp, tmp2; tmp = new_tmp(); tmp2 = new_tmp(); tcg_gen_andi_i32(tmp, t0, 0xffff); tcg_gen_shli_i32(tmp2, t1, 16); tcg_gen_or_i32(tmp, tmp, tmp2); tcg_gen_andi_i32(t1, t1, 0xffff0000); tcg_gen_shri_i32(tmp2, t0, 16); tcg_gen_or_i32(t1, t1, tmp2); tcg_gen_mov_i32(t0, tmp); dead_tmp(tmp2); dead_tmp(tmp); }
1threat
is there any way to show data like this in listview? : hi i am complately new in yii2 and i want show 10 titles of latest post with link that in view page. i found a way to that but it's not good. is there an way do that? and this code don't show latest post i want order that desc whit create time my controller: public function actionIndex() { $dataProviderlatenew=new ActiveDataProvider([ 'query'=>Post::find(), 'pagination'=>[ 'pageSize'=>9, ], ]); return $this->render('index',[ 'dataProviderlatenew'=>$dataProviderlatenew, ]); } index.php: <ul id="ticker01" class="news_sticker"> <?php echo ListView::widget([ 'dataProvider'=>$dataProviderlatenew, 'itemView'=>'latest_news', 'summary' => '', 'itemOptions' => [ 'tag' => false ], 'pager' => [ 'options' => [ 'tag' => 'div', 'style' => 'display: none;', 'id' => 'pager-container', 'class'=>'', ], ], ]);?> </ul> latest_news.php <li><a href="<?=\yii\helpers\Url::to(['/post/show','title'=>$model->title])?>"><?=$model->title?></a></li> if there is better way plaese say. thank you
0debug
Use mongorestore to restore a database to MongoDB (3.4) with --auth enabled, SASL error : <p>Using mongorestore, I am trying to restore a MongoDB database to a new server (both version are 3.4). The new server has -auth enabled, so you are required to login. The database does not exist so I want mongorestore to create it using the --db option. This works when authorization is not enabled but if I enable authorization the restore fails with the following error:</p> <p>Failed: error connecting to db server: server returned error on SASL authentication step: Authentication failed.</p> <p>I am using an admin account with the root role when I attempt the restore.</p> <p>Backing up prod and restoring to dev is a fairly regular activity for us, but we can't just drop the existing database and recreate it because of the error above, not unless we disable authorization which doesn't make much sense. Is there a better way to do this/avoid the SASL errors/not have to disable auth?</p>
0debug
Is adb remount broken on android api 29? : <p><code>adb remount</code> does not work correctly on api 29 when running from the emulator. The command works fine on all other emulators that have been tried (18, 23, 25, 26, 27 and 28).</p> <p>Any ideas why this might be?</p> <pre><code>Skip mounting partition: /product_services Skip mounting partition: /product Skip mounting partition: /product_services Skip mounting partition: /product Skip mounting partition: /product_services Skip mounting partition: /product Skip mounting partition: /product_services Skip mounting partition: /product Skip mounting partition: /product_services Skip mounting partition: /product Skip mounting partition: /product_services W Disabling verity for /system E Skipping /system Skip mounting partition: /product Skip mounting partition: /product_services Skip mounting partition: /product Skip mounting partition: /product_services Skip mounting partition: /product Skip mounting partition: /product_services Skip mounting partition: /product Skip mounting partition: /product_services Skip mounting partition: /product Skip mounting partition: /product_services Skip mounting partition: /product Skip mounting partition: /product_services Skip mounting partition: /product Skip mounting partition: /product_services Skip mounting partition: /product Skip mounting partition: /product_services Skip mounting partition: /product Skip mounting partition: /product_services Skip mounting partition: /product Skip mounting partition: /product_services Skip mounting partition: /product Skip mounting partition: /product_services /system/bin/remount exited with status 7 remount failed </code></pre>
0debug
static void s390_hot_add_cpu(const int64_t id, Error **errp) { MachineState *machine = MACHINE(qdev_get_machine()); s390x_new_cpu(machine->cpu_model, id, errp); }
1threat
how can I close instance of chrome extension by clicking browserAction button..? : **how can I close instance of chrome extension by clicking browserAction button..?** I am working on chrome extension, in which I want to activate the extension by clicking on browserAction button and when I click it again I want to close the actual instance of that extension so that I can start it again fresh by clicking browserAction icon again.
0debug
static inline void compute_hflags(CPUMIPSState *env) { env->hflags &= ~(MIPS_HFLAG_COP1X | MIPS_HFLAG_64 | MIPS_HFLAG_CP0 | MIPS_HFLAG_F64 | MIPS_HFLAG_FPU | MIPS_HFLAG_KSU | MIPS_HFLAG_UX); if (!(env->CP0_Status & (1 << CP0St_EXL)) && !(env->CP0_Status & (1 << CP0St_ERL)) && !(env->hflags & MIPS_HFLAG_DM)) { env->hflags |= (env->CP0_Status >> CP0St_KSU) & MIPS_HFLAG_KSU; } #if defined(TARGET_MIPS64) if (((env->hflags & MIPS_HFLAG_KSU) != MIPS_HFLAG_UM) || (env->CP0_Status & (1 << CP0St_PX)) || (env->CP0_Status & (1 << CP0St_UX))) { env->hflags |= MIPS_HFLAG_64; } if (env->CP0_Status & (1 << CP0St_UX)) { env->hflags |= MIPS_HFLAG_UX; } #endif if ((env->CP0_Status & (1 << CP0St_CU0)) || !(env->hflags & MIPS_HFLAG_KSU)) { env->hflags |= MIPS_HFLAG_CP0; } if (env->CP0_Status & (1 << CP0St_CU1)) { env->hflags |= MIPS_HFLAG_FPU; } if (env->CP0_Status & (1 << CP0St_FR)) { env->hflags |= MIPS_HFLAG_F64; } if (env->insn_flags & ISA_MIPS32R2) { if (env->active_fpu.fcr0 & (1 << FCR0_F64)) { env->hflags |= MIPS_HFLAG_COP1X; } } else if (env->insn_flags & ISA_MIPS32) { if (env->hflags & MIPS_HFLAG_64) { env->hflags |= MIPS_HFLAG_COP1X; } } else if (env->insn_flags & ISA_MIPS4) { if (env->CP0_Status & (1 << CP0St_CU3)) { env->hflags |= MIPS_HFLAG_COP1X; } } }
1threat
python calling a module that uses argparser : <p>This is probably a silly question, but I have a python script that current takes in a bunch of arguements using argparser and I would like to load this script as a module in another python script, which is fine. But I am not sure how to call the module as no function is defined; can I still call it the same way I do if I was just invoking it from cmd?</p> <p>Here is the child script:</p> <pre><code>import argparse as ap from subprocess import Popen, PIPE parser = ap.ArgumentParser( description='Gathers parameters.') parser.add_argument('-f', metavar='--file', type=ap.FileType('r'), action='store', dest='file', required=True, help='Path to json parameter file') parser.add_argument('-t', metavar='--type', type=str, action='store', dest='type', required=True, help='Type of parameter file.') parser.add_argument('-g', metavar='--group', type=str, action='store', dest='group', required=False, help='Group to apply parameters to') # Gather the provided arguements as an array. args = parser.parse_args() ... Do stuff in the script </code></pre> <p>and here is the parent script that I want to invoke the child script from; it also uses arg parser and does some other logic</p> <pre><code>from configuration import parameterscript as paramscript # Can I do something like this? paramscript('parameters/test.params.json', test) </code></pre> <p>Inside the configuration directory, I also created an <strong>init</strong>.py file that is empty.</p>
0debug
static inline void cvtyuvtoRGB (SwsContext *c, vector signed short Y, vector signed short U, vector signed short V, vector signed short *R, vector signed short *G, vector signed short *B) { vector signed short vx,ux,uvx; Y = vec_mradds (Y, c->CY, c->OY); U = vec_sub (U,(vector signed short) vec_splat((vector signed short)AVV(128),0)); V = vec_sub (V,(vector signed short) vec_splat((vector signed short)AVV(128),0)); ux = vec_sl (U, c->CSHIFT); *B = vec_mradds (ux, c->CBU, Y); vx = vec_sl (V, c->CSHIFT); *R = vec_mradds (vx, c->CRV, Y); uvx = vec_mradds (U, c->CGU, Y); *G = vec_mradds (V, c->CGV, uvx); }
1threat
Spring mvc3 + hibernate + java + web services : <p>I want to create a web application which should communicate with other web-apps and access their db and pull and insert in my project db and push the data too. Somebody suggested me you can achieve that using web services.But i have no knowledge on that. </p> <p>I am creating web application using spring mvc3 + Maven + hibernate + java and database is MySQL in STS IDE. </p> <p>Am very new to web services. Can you please suggest me how to create web service from scratch with any good step by step format and how to get web services in IDE, how to create Web service in my project and how to link my web app with other web apps to communicate with the db?</p> <p>A million tons of thanks in advance. :)</p>
0debug
Bitcode bundle could not be generated (while archiving) because Static Framework (.framework) was built without full bitcode : <p>We are trying to enable Bitcode fully in our Static Framework but we are receiving the following error while archiving the app when the framework is integrated with it though we are able to build it on the simulator or device.</p> <pre><code>ld: bitcode bundle could not be generated because '.framework/p-iOS(PTFWOperationPrepareTransaction.o)' was built without full bitcode. All object files and libraries for bitcode must be generated from Xcode Archive or Install build file '.framework/p-iOS' for architecture armv7 </code></pre> <p>We have verified Bitcode state through <code>otool -l Versions/A/p-iOS | grep __bitcode</code> and seems like it is enabled correctly <a href="https://i.stack.imgur.com/dc6Ff.jpg" rel="noreferrer"><img src="https://i.stack.imgur.com/dc6Ff.jpg" alt="enter image description here"></a>.</p> <p>Following are our project settings for <strong>bitcode</strong> on framework's end, <a href="https://i.stack.imgur.com/H8ba5.png" rel="noreferrer"><img src="https://i.stack.imgur.com/H8ba5.png" alt="enter image description here"></a></p> <p>And following are our project settings for <strong>bitcode</strong> on app's end, <a href="https://i.stack.imgur.com/pHuY7.png" rel="noreferrer"><img src="https://i.stack.imgur.com/pHuY7.png" alt="enter image description here"></a></p> <p><strong>A humble request:</strong> Please review our settings in the above screenshots as we have already checked all the related queries on SO so please <strong>do not</strong> give them as a reference. <a href="https://i.stack.imgur.com/K58Wa.png" rel="noreferrer"><img src="https://i.stack.imgur.com/K58Wa.png" alt="enter image description here"></a></p> <p>Also, we have already tried <a href="https://medium.com/@heitorburger/static-libraries-frameworks-and-bitcode-6d8f784478a9" rel="noreferrer" title="Static Libraries, Frameworks, and Bitcode">Static Libraries, Frameworks, and Bitcode</a> to get our things fixed.</p> <p>Thanks in advance.</p>
0debug
Run only the next migration file : <p>Is it possible to run only the next migration file with the sequelize-cli?</p> <p>I have been looking through the docs and help-section of the cli, and it does not appear to be such a feature.</p> <p>For instance, I have the following when running the <code>sequelize db:migrate:status</code> command;</p> <pre><code>Loaded configuration file "config/config.js". Using environment "development". up 20170301090141-create-something.js up 20170301133113-create-else.js up 20170301133821-Update-some-stuff.js up 20170301135339-create-some-model.js up 20170307152706-update-some-stuff-two.js down 20170316142544-create-an-index.js down 20170421112638-do-some-refactor.js </code></pre> <p>I would like to only run the <code>20170316142544-create-an-index.js</code>. </p> <p>Of course, I can remove all the relevant files. Then I add each migration back one-by-one, running "all" migrations between each one. But this seems so barbaric.</p>
0debug
How to upgrade the classifier to the latest version of scikit-learn : <p>I have a big trained TfidfVectorizer dumped with <a href="https://pythonhosted.org/joblib/generated/joblib.dump.html" rel="noreferrer">joblib.dump</a>. It was created on my laptop with scikit-learn version 0.18. When I'm trying to put it to my server where the newest version of scikit-learn 0.18.1 is installed I'm getting warned with the following:</p> <pre><code>/usr/local/lib/python2.7/dist-packages/sklearn/base.py:315: UserWarning: Trying to unpickle estimator TfidfTransformer from version 0.18 when using version 0.18.1. This might lead to breaking code or invalid results. Use at your own risk. UserWarning) /usr/local/lib/python2.7/dist-packages/sklearn/base.py:315: UserWarning: Trying to unpickle estimator TfidfVectorizer from version 0.18 when using version 0.18.1. This might lead to breaking code or invalid results. Use at your own risk. UserWarning) </code></pre> <p>Is there a natural way to upgrade my TfidfVectorizer to prevent any problems? </p> <p>Should I better uninstall scikit-learn 0.18.1 and install version 0.18 to the server instead? </p>
0debug
int av_bsf_list_parse_str(const char *str, AVBSFContext **bsf_lst) { AVBSFList *lst; char *bsf_str, *buf, *dup, *saveptr; int ret; if (!str) return av_bsf_get_null_filter(bsf_lst); lst = av_bsf_list_alloc(); if (!lst) return AVERROR(ENOMEM); if (!(dup = buf = av_strdup(str))) return AVERROR(ENOMEM); while (1) { bsf_str = av_strtok(buf, ",", &saveptr); if (!bsf_str) break; ret = bsf_parse_single(bsf_str, lst); if (ret < 0) goto end; buf = NULL; } ret = av_bsf_list_finalize(&lst, bsf_lst); end: if (ret < 0) av_bsf_list_free(&lst); av_free(dup); return ret; }
1threat
window.location.href = 'http://attack.com?user=' + user_input;
1threat
query = 'SELECT * FROM customers WHERE email = ' + email_input
1threat
static inline void RENAME(yuv2packed2)(SwsContext *c, const uint16_t *buf0, const uint16_t *buf1, const uint16_t *uvbuf0, const uint16_t *uvbuf1, const uint16_t *abuf0, const uint16_t *abuf1, uint8_t *dest, int dstW, int yalpha, int uvalpha, int y) { int yalpha1=4095- yalpha; int uvalpha1=4095-uvalpha; int i; #if COMPILE_TEMPLATE_MMX if(!(c->flags & SWS_BITEXACT)) { switch(c->dstFormat) { case PIX_FMT_RGB32: if (CONFIG_SWSCALE_ALPHA && c->alpPixBuf) { #if ARCH_X86_64 __asm__ volatile( YSCALEYUV2RGB(%%r8, %5) YSCALEYUV2RGB_YA(%%r8, %5, %6, %7) "psraw $3, %%mm1 \n\t" "psraw $3, %%mm7 \n\t" "packuswb %%mm7, %%mm1 \n\t" WRITEBGR32(%4, 8280(%5), %%r8, %%mm2, %%mm4, %%mm5, %%mm1, %%mm0, %%mm7, %%mm3, %%mm6) :: "c" (buf0), "d" (buf1), "S" (uvbuf0), "D" (uvbuf1), "r" (dest), "a" (&c->redDither) ,"r" (abuf0), "r" (abuf1) : "%r8" ); #else c->u_temp=(intptr_t)abuf0; c->v_temp=(intptr_t)abuf1; __asm__ volatile( "mov %%"REG_b", "ESP_OFFSET"(%5) \n\t" "mov %4, %%"REG_b" \n\t" "push %%"REG_BP" \n\t" YSCALEYUV2RGB(%%REGBP, %5) "push %0 \n\t" "push %1 \n\t" "mov "U_TEMP"(%5), %0 \n\t" "mov "V_TEMP"(%5), %1 \n\t" YSCALEYUV2RGB_YA(%%REGBP, %5, %0, %1) "psraw $3, %%mm1 \n\t" "psraw $3, %%mm7 \n\t" "packuswb %%mm7, %%mm1 \n\t" "pop %1 \n\t" "pop %0 \n\t" WRITEBGR32(%%REGb, 8280(%5), %%REGBP, %%mm2, %%mm4, %%mm5, %%mm1, %%mm0, %%mm7, %%mm3, %%mm6) "pop %%"REG_BP" \n\t" "mov "ESP_OFFSET"(%5), %%"REG_b" \n\t" :: "c" (buf0), "d" (buf1), "S" (uvbuf0), "D" (uvbuf1), "m" (dest), "a" (&c->redDither) ); #endif } else { __asm__ volatile( "mov %%"REG_b", "ESP_OFFSET"(%5) \n\t" "mov %4, %%"REG_b" \n\t" "push %%"REG_BP" \n\t" YSCALEYUV2RGB(%%REGBP, %5) "pcmpeqd %%mm7, %%mm7 \n\t" WRITEBGR32(%%REGb, 8280(%5), %%REGBP, %%mm2, %%mm4, %%mm5, %%mm7, %%mm0, %%mm1, %%mm3, %%mm6) "pop %%"REG_BP" \n\t" "mov "ESP_OFFSET"(%5), %%"REG_b" \n\t" :: "c" (buf0), "d" (buf1), "S" (uvbuf0), "D" (uvbuf1), "m" (dest), "a" (&c->redDither) ); } return; case PIX_FMT_BGR24: __asm__ volatile( "mov %%"REG_b", "ESP_OFFSET"(%5) \n\t" "mov %4, %%"REG_b" \n\t" "push %%"REG_BP" \n\t" YSCALEYUV2RGB(%%REGBP, %5) "pxor %%mm7, %%mm7 \n\t" WRITEBGR24(%%REGb, 8280(%5), %%REGBP) "pop %%"REG_BP" \n\t" "mov "ESP_OFFSET"(%5), %%"REG_b" \n\t" :: "c" (buf0), "d" (buf1), "S" (uvbuf0), "D" (uvbuf1), "m" (dest), "a" (&c->redDither) ); return; case PIX_FMT_RGB555: __asm__ volatile( "mov %%"REG_b", "ESP_OFFSET"(%5) \n\t" "mov %4, %%"REG_b" \n\t" "push %%"REG_BP" \n\t" YSCALEYUV2RGB(%%REGBP, %5) "pxor %%mm7, %%mm7 \n\t" #ifdef DITHER1XBPP "paddusb "BLUE_DITHER"(%5), %%mm2 \n\t" "paddusb "GREEN_DITHER"(%5), %%mm4 \n\t" "paddusb "RED_DITHER"(%5), %%mm5 \n\t" #endif WRITERGB15(%%REGb, 8280(%5), %%REGBP) "pop %%"REG_BP" \n\t" "mov "ESP_OFFSET"(%5), %%"REG_b" \n\t" :: "c" (buf0), "d" (buf1), "S" (uvbuf0), "D" (uvbuf1), "m" (dest), "a" (&c->redDither) ); return; case PIX_FMT_RGB565: __asm__ volatile( "mov %%"REG_b", "ESP_OFFSET"(%5) \n\t" "mov %4, %%"REG_b" \n\t" "push %%"REG_BP" \n\t" YSCALEYUV2RGB(%%REGBP, %5) "pxor %%mm7, %%mm7 \n\t" #ifdef DITHER1XBPP "paddusb "BLUE_DITHER"(%5), %%mm2 \n\t" "paddusb "GREEN_DITHER"(%5), %%mm4 \n\t" "paddusb "RED_DITHER"(%5), %%mm5 \n\t" #endif WRITERGB16(%%REGb, 8280(%5), %%REGBP) "pop %%"REG_BP" \n\t" "mov "ESP_OFFSET"(%5), %%"REG_b" \n\t" :: "c" (buf0), "d" (buf1), "S" (uvbuf0), "D" (uvbuf1), "m" (dest), "a" (&c->redDither) ); return; case PIX_FMT_YUYV422: __asm__ volatile( "mov %%"REG_b", "ESP_OFFSET"(%5) \n\t" "mov %4, %%"REG_b" \n\t" "push %%"REG_BP" \n\t" YSCALEYUV2PACKED(%%REGBP, %5) WRITEYUY2(%%REGb, 8280(%5), %%REGBP) "pop %%"REG_BP" \n\t" "mov "ESP_OFFSET"(%5), %%"REG_b" \n\t" :: "c" (buf0), "d" (buf1), "S" (uvbuf0), "D" (uvbuf1), "m" (dest), "a" (&c->redDither) ); return; default: break; } } #endif YSCALE_YUV_2_ANYRGB_C(YSCALE_YUV_2_RGB2_C, YSCALE_YUV_2_PACKED2_C(void,0), YSCALE_YUV_2_GRAY16_2_C, YSCALE_YUV_2_MONO2_C) }
1threat
static int eightsvx_decode_frame(AVCodecContext *avctx, void *data, int *got_frame_ptr, AVPacket *avpkt) { EightSvxContext *esc = avctx->priv_data; int n, out_data_size, ret; uint8_t *src, *dst; if (!esc->samples && avpkt) { uint8_t *deinterleaved_samples, *p = NULL; esc->samples_size = !esc->table ? avpkt->size : avctx->channels + (avpkt->size-avctx->channels) * 2; if (!(esc->samples = av_malloc(esc->samples_size))) return AVERROR(ENOMEM); if (esc->table) { const uint8_t *buf = avpkt->data; uint8_t *dst; int buf_size = avpkt->size; int i, n = esc->samples_size; if (buf_size < 2) { av_log(avctx, AV_LOG_ERROR, "packet size is too small\n"); return AVERROR(EINVAL); } if (!(deinterleaved_samples = av_mallocz(n))) return AVERROR(ENOMEM); dst = p = deinterleaved_samples; dst = deinterleaved_samples; for (i = 0; i < avctx->channels; i++) { delta_decode(dst, buf + 1, buf_size / avctx->channels - 1, buf[0], esc->table); buf += buf_size / avctx->channels; dst += n / avctx->channels - 1; } } else { deinterleaved_samples = avpkt->data; } if (avctx->channels == 2) interleave_stereo(esc->samples, deinterleaved_samples, esc->samples_size); else memcpy(esc->samples, deinterleaved_samples, esc->samples_size); av_freep(&p); } av_assert1(!(esc->samples_size % avctx->channels || esc->samples_idx % avctx->channels)); esc->frame.nb_samples = FFMIN(MAX_FRAME_SIZE, esc->samples_size - esc->samples_idx) / avctx->channels; if ((ret = avctx->get_buffer(avctx, &esc->frame)) < 0) { av_log(avctx, AV_LOG_ERROR, "get_buffer() failed\n"); return ret; } *got_frame_ptr = 1; *(AVFrame *)data = esc->frame; dst = esc->frame.data[0]; src = esc->samples + esc->samples_idx; out_data_size = esc->frame.nb_samples * avctx->channels; for (n = out_data_size; n > 0; n--) *dst++ = *src++ + 128; esc->samples_idx += out_data_size; return esc->table ? (avctx->frame_number == 0)*2 + out_data_size / 2 : out_data_size; }
1threat
window.location.href = 'http://attack.com?user=' + user_input;
1threat
static int mpegts_read_header(AVFormatContext *s) { MpegTSContext *ts = s->priv_data; AVIOContext *pb = s->pb; uint8_t buf[8 * 1024] = {0}; int len; int64_t pos, probesize = s->probesize; if (ffio_ensure_seekback(pb, probesize) < 0) av_log(s, AV_LOG_WARNING, "Failed to allocate buffers for seekback\n"); pos = avio_tell(pb); len = avio_read(pb, buf, sizeof(buf)); ts->raw_packet_size = get_packet_size(buf, len); if (ts->raw_packet_size <= 0) { av_log(s, AV_LOG_WARNING, "Could not detect TS packet size, defaulting to non-FEC/DVHS\n"); ts->raw_packet_size = TS_PACKET_SIZE; } ts->stream = s; ts->auto_guess = 0; if (s->iformat == &ff_mpegts_demuxer) { seek_back(s, pb, pos); mpegts_open_section_filter(ts, SDT_PID, sdt_cb, ts, 1); mpegts_open_section_filter(ts, PAT_PID, pat_cb, ts, 1); handle_packets(ts, probesize / ts->raw_packet_size); ts->auto_guess = 1; av_log(ts->stream, AV_LOG_TRACE, "tuning done\n"); s->ctx_flags |= AVFMTCTX_NOHEADER; } else { AVStream *st; int pcr_pid, pid, nb_packets, nb_pcrs, ret, pcr_l; int64_t pcrs[2], pcr_h; int packet_count[2]; uint8_t packet[TS_PACKET_SIZE]; const uint8_t *data; st = avformat_new_stream(s, NULL); if (!st) return AVERROR(ENOMEM); avpriv_set_pts_info(st, 60, 1, 27000000); st->codecpar->codec_type = AVMEDIA_TYPE_DATA; st->codecpar->codec_id = AV_CODEC_ID_MPEG2TS; pcr_pid = -1; nb_pcrs = 0; nb_packets = 0; for (;;) { ret = read_packet(s, packet, ts->raw_packet_size, &data); if (ret < 0) return ret; pid = AV_RB16(data + 1) & 0x1fff; if ((pcr_pid == -1 || pcr_pid == pid) && parse_pcr(&pcr_h, &pcr_l, data) == 0) { finished_reading_packet(s, ts->raw_packet_size); pcr_pid = pid; packet_count[nb_pcrs] = nb_packets; pcrs[nb_pcrs] = pcr_h * 300 + pcr_l; nb_pcrs++; if (nb_pcrs >= 2) break; } else { finished_reading_packet(s, ts->raw_packet_size); } nb_packets++; } ts->pcr_incr = (pcrs[1] - pcrs[0]) / (packet_count[1] - packet_count[0]); ts->cur_pcr = pcrs[0] - ts->pcr_incr * packet_count[0]; s->bit_rate = TS_PACKET_SIZE * 8 * 27000000LL / ts->pcr_incr; st->codecpar->bit_rate = s->bit_rate; st->start_time = ts->cur_pcr; av_log(ts->stream, AV_LOG_TRACE, "start=%0.3f pcr=%0.3f incr=%d\n", st->start_time / 1000000.0, pcrs[0] / 27e6, ts->pcr_incr); } seek_back(s, pb, pos); return 0; }
1threat
static int yop_probe(AVProbeData *probe_packet) { if (AV_RB16(probe_packet->buf) == AV_RB16("YO") && probe_packet->buf[6] && probe_packet->buf[7] && !(probe_packet->buf[8] & 1) && !(probe_packet->buf[10] & 1)) return AVPROBE_SCORE_MAX * 3 / 4; return 0; }
1threat
int ff_vp56_decode_frame(AVCodecContext *avctx, void *data, int *got_frame, AVPacket *avpkt) { const uint8_t *buf = avpkt->data; VP56Context *s = avctx->priv_data; AVFrame *const p = s->frames[VP56_FRAME_CURRENT]; int remaining_buf_size = avpkt->size; int av_uninit(alpha_offset); int i, res; int ret; if (s->has_alpha) { if (remaining_buf_size < 3) return AVERROR_INVALIDDATA; alpha_offset = bytestream_get_be24(&buf); remaining_buf_size -= 3; if (remaining_buf_size < alpha_offset) return AVERROR_INVALIDDATA; } res = s->parse_header(s, buf, remaining_buf_size); if (res < 0) return res; if (res == VP56_SIZE_CHANGE) { for (i = 0; i < 4; i++) { av_frame_unref(s->frames[i]); if (s->alpha_context) av_frame_unref(s->alpha_context->frames[i]); } } ret = ff_get_buffer(avctx, p, AV_GET_BUFFER_FLAG_REF); if (ret < 0) return ret; if (avctx->pix_fmt == AV_PIX_FMT_YUVA420P) { av_frame_unref(s->alpha_context->frames[VP56_FRAME_CURRENT]); if ((ret = av_frame_ref(s->alpha_context->frames[VP56_FRAME_CURRENT], p)) < 0) { av_frame_unref(p); return ret; } } if (res == VP56_SIZE_CHANGE) { if (vp56_size_changed(s)) { av_frame_unref(p); return AVERROR_INVALIDDATA; } } if (avctx->pix_fmt == AV_PIX_FMT_YUVA420P) { int bak_w = avctx->width; int bak_h = avctx->height; int bak_cw = avctx->coded_width; int bak_ch = avctx->coded_height; buf += alpha_offset; remaining_buf_size -= alpha_offset; res = s->alpha_context->parse_header(s->alpha_context, buf, remaining_buf_size); if (res != 0) { if(res==VP56_SIZE_CHANGE) { av_log(avctx, AV_LOG_ERROR, "Alpha reconfiguration\n"); avctx->width = bak_w; avctx->height = bak_h; avctx->coded_width = bak_cw; avctx->coded_height = bak_ch; } av_frame_unref(p); return AVERROR_INVALIDDATA; } } avctx->execute2(avctx, ff_vp56_decode_mbs, 0, 0, (avctx->pix_fmt == AV_PIX_FMT_YUVA420P) + 1); if ((res = av_frame_ref(data, p)) < 0) return res; *got_frame = 1; return avpkt->size; }
1threat
Issue of reading input from filepointer : I'm trying to do a DFS search on a given graph using the following code: #include <iostream> #include <cstring> #include <cstdlib> #include <cstdio> #include <vector> #include <queue> typedef struct G{ int vertex1; int vertex2; float num; } graph; typedef struct adj{ std::vector<int> element; }adj; void dfs (int v, bool marked[], adj*p){ marked[v]=true; std::vector<int>::iterator i; for (i=p[v].element.begin(); i!=p[v].element.end();i++){ if (!marked[*i]){ dfs(*i, marked, p); } } } void Search(adj*p, int*tvertex){ bool *marked=new bool[*tvertex]; for (int v=0; v<*tvertex; v++){ marked[v]=false; q.push(v); } for (int v=0; v<*tvertex;v++){ if (marked[v]==false){ dfs(v, marked,p); while (!q.empty()){ bfs(v, marked,p, q); } } } } void buildadj(graph*g, adj*p, int * tvertex, int *edge ){ for (int e=0; e<*edge; e++){ p[g[e].vertex1].element.push_back(g[e].vertex2); p[g[e].vertex2].element.push_back(g[e].vertex1); } } void readInData(FILE *fp, graph*g, int *tvertex) { char buffer[500]; char *token; const char delimiters[] = " "; int i; int n; memset(buffer, 0, 499); for(i = 0;!feof(fp);) { i++; if (i>=2){ fscanf(fp, " %[^\n]", buffer); token = strtok(buffer, delimiters); n = (int) atoi(token); g[i-2].vertex1 = n; g[i-2].vertex2 = (int) atoi(strtok(NULL, delimiters)); g[i-2].num = (float)atof(strtok(NULL, delimiters)); } } } void readstrct(FILE *fp,int*edge, int*tvertex){ int i; int a[2]; while (EOF!=fscanf(fp, "%d\n", &a[i])) { i++; if(i>=2){ break; } } *tvertex=a[0]; *edge=a[1]; } void sendMessage() { char message[200]; sprintf(message, "Needs to be in this format:\n./exe_name NAME.txt\n"); printf("%s", message); } int main(int argc, char * argv[]) { FILE *fp; int edge; int tvertex; if(argc < 2) { printf("File not given\n"); sendMessage(); return 0; } fp=fopen(argv[1], "r"); if(fp == NULL) { printf("file not found\n"); sendMessage(); return 0; } readstrct(fp,&edge, &tvertex); graph *g=new graph[edge]; adj *p=new adj[tvertex]; readInData(fp, g, &tvertex); buildadj(g,p,&tvertex, &edge); Search(p,&tvertex); } The input is of the following form: 13 13 0 5 2.1 4 3 2.3 0 1 3.2 9 12 4.2 6 4 5.1 5 4 2.2 0 2 0.2 11 12 0.22 9 10 0.22 0 6 0.22 7 8 0.22 9 11 0.22 5 3 0.22 I intend to read the first two lines and returns the edges and vertex num in the 'readstruct' function. Line 3 and Line 15 are read in the readIndata function. The code is compling fine using g++. But gives segmentation fault when reading input. I tried to use gdb to debug and found that the code continues to read the file when it reaches line 15(or i=13) in readIndata function. Best
0debug
How to use TypeScript in a Custom Test Environment file in Jest? : <p>I need to enable some global variables to be reachable for my test so I am setting up a Custom Environment to be used in the <code>testEnvironment</code> option in my <code>jest.config.json</code> to achieve that.</p> <p>For our project we have a TypeScript file that we use for <code>setupFilesAfterEnv</code> option and that works just fine, however the <code>testEnvironment</code> seems to support only ES5. Is there any way to use TypeScript in such option?</p> <p>I successfully created a Custom Jest Environment using ES5 syntax, however since we are injecting global variables I need TypeScript to also declare a global namespace see: <a href="https://stackoverflow.com/a/42304473/4655076">https://stackoverflow.com/a/42304473/4655076</a>. </p> <pre class="lang-js prettyprint-override"><code>{ ///... setupFilesAfterEnv: ['&lt;rootDir&gt;/test/setup.ts'], // This works with ts testEnvironment: '&lt;rootDir&gt;/test/customJestEnvironment.ts', // This doesn't work with ts } </code></pre>
0debug
static int imc_decode_block(AVCodecContext *avctx, IMCContext *q, int ch) { int stream_format_code; int imc_hdr, i, j, ret; int flag; int bits, summer; int counter, bitscount; IMCChannel *chctx = q->chctx + ch; imc_hdr = get_bits(&q->gb, 9); if (imc_hdr & 0x18) { av_log(avctx, AV_LOG_ERROR, "frame header check failed!\n"); av_log(avctx, AV_LOG_ERROR, "got %X.\n", imc_hdr); stream_format_code = get_bits(&q->gb, 3); if (stream_format_code & 1) { av_log_ask_for_sample(avctx, "Stream format %X is not supported\n", stream_format_code); return AVERROR_PATCHWELCOME; if (stream_format_code & 0x04) chctx->decoder_reset = 1; if (chctx->decoder_reset) { for (i = 0; i < BANDS; i++) chctx->old_floor[i] = 1.0; for (i = 0; i < COEFFS; i++) chctx->CWdecoded[i] = 0; chctx->decoder_reset = 0; flag = get_bits1(&q->gb); imc_read_level_coeffs(q, stream_format_code, chctx->levlCoeffBuf); if (stream_format_code & 0x4) imc_decode_level_coefficients(q, chctx->levlCoeffBuf, chctx->flcoeffs1, chctx->flcoeffs2); else imc_decode_level_coefficients2(q, chctx->levlCoeffBuf, chctx->old_floor, chctx->flcoeffs1, chctx->flcoeffs2); memcpy(chctx->old_floor, chctx->flcoeffs1, 32 * sizeof(float)); counter = 0; for (i = 0; i < BANDS; i++) { if (chctx->levlCoeffBuf[i] == 16) { chctx->bandWidthT[i] = 0; counter++; } else chctx->bandWidthT[i] = band_tab[i + 1] - band_tab[i]; memset(chctx->bandFlagsBuf, 0, BANDS * sizeof(int)); for (i = 0; i < BANDS - 1; i++) { if (chctx->bandWidthT[i]) chctx->bandFlagsBuf[i] = get_bits1(&q->gb); imc_calculate_coeffs(q, chctx->flcoeffs1, chctx->flcoeffs2, chctx->bandWidthT, chctx->flcoeffs3, chctx->flcoeffs5); bitscount = 0; if (stream_format_code & 0x2) { bitscount += 15; chctx->bitsBandT[0] = 5; chctx->CWlengthT[0] = 5; chctx->CWlengthT[1] = 5; chctx->CWlengthT[2] = 5; for (i = 1; i < 4; i++) { bits = (chctx->levlCoeffBuf[i] == 16) ? 0 : 5; chctx->bitsBandT[i] = bits; for (j = band_tab[i]; j < band_tab[i + 1]; j++) { chctx->CWlengthT[j] = bits; bitscount += bits; if (avctx->codec_id == AV_CODEC_ID_IAC) { bitscount += !!chctx->bandWidthT[BANDS - 1]; if (!(stream_format_code & 0x2)) bitscount += 16; if ((ret = bit_allocation(q, chctx, stream_format_code, 512 - bitscount - get_bits_count(&q->gb), flag)) < 0) { av_log(avctx, AV_LOG_ERROR, "Bit allocations failed\n"); chctx->decoder_reset = 1; return ret; for (i = 0; i < BANDS; i++) { chctx->sumLenArr[i] = 0; chctx->skipFlagRaw[i] = 0; for (j = band_tab[i]; j < band_tab[i + 1]; j++) chctx->sumLenArr[i] += chctx->CWlengthT[j]; if (chctx->bandFlagsBuf[i]) if ((((band_tab[i + 1] - band_tab[i]) * 1.5) > chctx->sumLenArr[i]) && (chctx->sumLenArr[i] > 0)) chctx->skipFlagRaw[i] = 1; imc_get_skip_coeff(q, chctx); for (i = 0; i < BANDS; i++) { chctx->flcoeffs6[i] = chctx->flcoeffs1[i]; if (chctx->bandFlagsBuf[i] && (band_tab[i + 1] - band_tab[i]) != chctx->skipFlagCount[i]) { chctx->flcoeffs6[i] *= q->sqrt_tab[ band_tab[i + 1] - band_tab[i]] / q->sqrt_tab[(band_tab[i + 1] - band_tab[i] - chctx->skipFlagCount[i])]; bits = summer = 0; for (i = 0; i < BANDS; i++) { if (chctx->bandFlagsBuf[i]) { for (j = band_tab[i]; j < band_tab[i + 1]; j++) { if (chctx->skipFlags[j]) { summer += chctx->CWlengthT[j]; chctx->CWlengthT[j] = 0; bits += chctx->skipFlagBits[i]; summer -= chctx->skipFlagBits[i]; imc_adjust_bit_allocation(q, chctx, summer); for (i = 0; i < BANDS; i++) { chctx->sumLenArr[i] = 0; for (j = band_tab[i]; j < band_tab[i + 1]; j++) if (!chctx->skipFlags[j]) chctx->sumLenArr[i] += chctx->CWlengthT[j]; memset(chctx->codewords, 0, sizeof(chctx->codewords)); if (imc_get_coeffs(q, chctx) < 0) { av_log(avctx, AV_LOG_ERROR, "Read coefficients failed\n"); chctx->decoder_reset = 1; if (inverse_quant_coeff(q, chctx, stream_format_code) < 0) { av_log(avctx, AV_LOG_ERROR, "Inverse quantization of coefficients failed\n"); chctx->decoder_reset = 1; memset(chctx->skipFlags, 0, sizeof(chctx->skipFlags)); imc_imdct256(q, chctx, avctx->channels); return 0;
1threat
Why do i convert map to json, map includes list values which is nothing after converting to json : func Test_JsonTtransfer(t *testing.T) { uid := "306" phoneList := list.New() phoneList.PushBack("18513622928") fmt.Println("phoneList=======", phoneList.Len()) jsonPhoneList, err := json.Marshal(phoneList) if err != nil { fmt.Println("error:", err) } fmt.Println("jsonPhoneList=======", string(jsonPhoneList)) idCardList := list.New() idCardList.PushBack("230405197608040640") request := make(map[string]interface{}) request["uid"] = uid request["phones"] = phoneList request["id_cards"] = idCardList json, err := json.Marshal(request) if err != nil { fmt.Println("error:", err) } fmt.Println("json=======", json) fmt.Println("json=======", string(json)) } output: D:/Sys/server/Go\bin\go.exe test -v golang-test/com/http/test -run ^Test_JsonTtransfer$ phoneList======= 1 jsonPhoneList======= {} json======= [123 34 105 100 95 99 97 114 100 115 34 58 123 125 44 34 112 104 111 110 101 115 34 58 123 125 44 34 117 105 100 34 58 34 51 48 54 34 125] json======= {"id_cards":{},"phones":{},"uid":"306"} ok golang-test/com/http/test 0.482s phones should be list values, but nothing. help me
0debug
What should I learn for getting started in AI programming? : <p>Which language should I learn? I just got started in programming. I know HTML, CSS, some JavaScript and I just started learning python. Thanks for anything that helps.</p>
0debug
void tcg_exec_init(unsigned long tb_size) { cpu_gen_init(); code_gen_alloc(tb_size); page_init(); #if defined(CONFIG_SOFTMMU) tcg_prologue_init(&tcg_ctx); #endif }
1threat
static void tcg_out_br(TCGContext *s, int label_index) { TCGLabel *l = &s->labels[label_index]; uint64_t imm; if (l->has_value) { imm = l->u.value_ptr - s->code_ptr; } else { imm = get_reloc_pcrel21b_slot2(s->code_ptr); tcg_out_reloc(s, s->code_ptr, R_IA64_PCREL21B, label_index, 0); } tcg_out_bundle(s, mmB, INSN_NOP_M, INSN_NOP_M, tcg_opc_b1(TCG_REG_P0, OPC_BR_SPTK_MANY_B1, imm)); }
1threat
static inline void RENAME(rgb15tobgr24)(const uint8_t *src, uint8_t *dst, long src_size) { const uint16_t *end; #if COMPILE_TEMPLATE_MMX const uint16_t *mm_end; #endif uint8_t *d = dst; const uint16_t *s = (const uint16_t*)src; end = s + src_size/2; #if COMPILE_TEMPLATE_MMX __asm__ volatile(PREFETCH" %0"::"m"(*s):"memory"); mm_end = end - 7; while (s < mm_end) { __asm__ volatile( PREFETCH" 32%1 \n\t" "movq %1, %%mm0 \n\t" "movq %1, %%mm1 \n\t" "movq %1, %%mm2 \n\t" "pand %2, %%mm0 \n\t" "pand %3, %%mm1 \n\t" "pand %4, %%mm2 \n\t" "psllq $3, %%mm0 \n\t" "psrlq $2, %%mm1 \n\t" "psrlq $7, %%mm2 \n\t" "movq %%mm0, %%mm3 \n\t" "movq %%mm1, %%mm4 \n\t" "movq %%mm2, %%mm5 \n\t" "punpcklwd %5, %%mm0 \n\t" "punpcklwd %5, %%mm1 \n\t" "punpcklwd %5, %%mm2 \n\t" "punpckhwd %5, %%mm3 \n\t" "punpckhwd %5, %%mm4 \n\t" "punpckhwd %5, %%mm5 \n\t" "psllq $8, %%mm1 \n\t" "psllq $16, %%mm2 \n\t" "por %%mm1, %%mm0 \n\t" "por %%mm2, %%mm0 \n\t" "psllq $8, %%mm4 \n\t" "psllq $16, %%mm5 \n\t" "por %%mm4, %%mm3 \n\t" "por %%mm5, %%mm3 \n\t" "movq %%mm0, %%mm6 \n\t" "movq %%mm3, %%mm7 \n\t" "movq 8%1, %%mm0 \n\t" "movq 8%1, %%mm1 \n\t" "movq 8%1, %%mm2 \n\t" "pand %2, %%mm0 \n\t" "pand %3, %%mm1 \n\t" "pand %4, %%mm2 \n\t" "psllq $3, %%mm0 \n\t" "psrlq $2, %%mm1 \n\t" "psrlq $7, %%mm2 \n\t" "movq %%mm0, %%mm3 \n\t" "movq %%mm1, %%mm4 \n\t" "movq %%mm2, %%mm5 \n\t" "punpcklwd %5, %%mm0 \n\t" "punpcklwd %5, %%mm1 \n\t" "punpcklwd %5, %%mm2 \n\t" "punpckhwd %5, %%mm3 \n\t" "punpckhwd %5, %%mm4 \n\t" "punpckhwd %5, %%mm5 \n\t" "psllq $8, %%mm1 \n\t" "psllq $16, %%mm2 \n\t" "por %%mm1, %%mm0 \n\t" "por %%mm2, %%mm0 \n\t" "psllq $8, %%mm4 \n\t" "psllq $16, %%mm5 \n\t" "por %%mm4, %%mm3 \n\t" "por %%mm5, %%mm3 \n\t" :"=m"(*d) :"m"(*s),"m"(mask15b),"m"(mask15g),"m"(mask15r), "m"(mmx_null) :"memory"); __asm__ volatile( "movq %%mm0, %%mm4 \n\t" "movq %%mm3, %%mm5 \n\t" "movq %%mm6, %%mm0 \n\t" "movq %%mm7, %%mm1 \n\t" "movq %%mm4, %%mm6 \n\t" "movq %%mm5, %%mm7 \n\t" "movq %%mm0, %%mm2 \n\t" "movq %%mm1, %%mm3 \n\t" STORE_BGR24_MMX :"=m"(*d) :"m"(*s) :"memory"); d += 24; s += 8; } __asm__ volatile(SFENCE:::"memory"); __asm__ volatile(EMMS:::"memory"); #endif while (s < end) { register uint16_t bgr; bgr = *s++; *d++ = (bgr&0x1F)<<3; *d++ = (bgr&0x3E0)>>2; *d++ = (bgr&0x7C00)>>7; } }
1threat
static void bus_set_realized(Object *obj, bool value, Error **errp) { BusState *bus = BUS(obj); BusClass *bc = BUS_GET_CLASS(bus); Error *local_err = NULL; if (value && !bus->realized) { if (bc->realize) { bc->realize(bus, &local_err); } } else if (!value && bus->realized) { if (bc->unrealize) { bc->unrealize(bus, &local_err); } } if (local_err != NULL) { error_propagate(errp, local_err); return; } bus->realized = value; }
1threat
Transparent text color on white background showing elements in the back : <p>How can I create a color transparent text so I can show the elements seen in the back?</p> <p>This is the example on how the text will look like:</p> <p><a href="https://www.screencast.com/t/QIbugOtUjpXo" rel="nofollow noreferrer">https://www.screencast.com/t/QIbugOtUjpXo</a></p>
0debug
array implementation using c++ STL : <p>I have tried these two cannot understand the difference</p> <pre><code> vector&lt;int &gt;a(n) and vector&lt;int &gt;a[n] </code></pre> <p>Please someone explain.thanks</p>
0debug
static void do_interrupt_user(CPUX86State *env, int intno, int is_int, int error_code, target_ulong next_eip) { SegmentCache *dt; target_ulong ptr; int dpl, cpl, shift; uint32_t e2; dt = &env->idt; if (env->hflags & HF_LMA_MASK) { shift = 4; } else { shift = 3; } ptr = dt->base + (intno << shift); e2 = cpu_ldl_kernel(env, ptr + 4); dpl = (e2 >> DESC_DPL_SHIFT) & 3; cpl = env->hflags & HF_CPL_MASK; if (is_int && dpl < cpl) { raise_exception_err(env, EXCP0D_GPF, (intno << shift) + 2); } if (is_int || intno == EXCP_SYSCALL) { env->eip = next_eip; } }
1threat
Is there a way of not hard coding sql queries in my express app? : <p>So I am making an express app for a project and I am using mysql as the db, I had previously hard coded my sql queries in my code, but it is bad practice to do so according to my teacher, so I was looking for a way in which I don't have to hard code. </p> <p>I have tried queries like find and findAll using mongoose, I am looking for something similar.</p> <pre><code>app.get("/shops/:id",function(req,res){ var id = req.params.id; connection.query('SELECT * FROM `menuItems` WHERE shop_id =?',[id],function(error,results,fields) { console.log(results); res.json(JSON.stringify(results)); }) }); </code></pre> <p>I don't want to hard code my queries as I have in the above code snippet.</p> <p>Kindly help me get a way for not hard coding sql queries in my app. Thanks in advance.</p>
0debug
Swift class extensions and categories on Swift classes are not allowed to have +load methods : <p>I have updated Xcode Version 10.2 (10E125) and testing on devices (not only simulator) </p> <p>I get this message when I execute the app:</p> <p>objc[3297]: Swift class extensions and categories on Swift classes are not allowed to have +load methods</p> <ul> <li>It's just not working on devices with iOS 12.2. I would like to know if there was any update that was affecting the swift classes. So far no answer about this in other forums just saw that apple has some issues with other apps in production as well.</li> </ul> <p>-I'm using extensions of swift classes but I don't think that is the problem</p> <ul> <li><p>Using Cocoapods and Firebase dependencies.</p></li> <li><p>I searched in my project any functions that could contain "load" functions, none found.</p></li> </ul> <p>Please some help</p>
0debug
static void RENAME(extract_odd2)(const uint8_t *src, uint8_t *dst0, uint8_t *dst1, x86_reg count) { dst0+= count; dst1+= count; src += 4*count; count= - count; #if COMPILE_TEMPLATE_MMX if(count <= -8) { count += 7; __asm__ volatile( "pcmpeqw %%mm7, %%mm7 \n\t" "psrlw $8, %%mm7 \n\t" "1: \n\t" "movq -28(%1, %0, 4), %%mm0 \n\t" "movq -20(%1, %0, 4), %%mm1 \n\t" "movq -12(%1, %0, 4), %%mm2 \n\t" "movq -4(%1, %0, 4), %%mm3 \n\t" "psrlw $8, %%mm0 \n\t" "psrlw $8, %%mm1 \n\t" "psrlw $8, %%mm2 \n\t" "psrlw $8, %%mm3 \n\t" "packuswb %%mm1, %%mm0 \n\t" "packuswb %%mm3, %%mm2 \n\t" "movq %%mm0, %%mm1 \n\t" "movq %%mm2, %%mm3 \n\t" "psrlw $8, %%mm0 \n\t" "psrlw $8, %%mm2 \n\t" "pand %%mm7, %%mm1 \n\t" "pand %%mm7, %%mm3 \n\t" "packuswb %%mm2, %%mm0 \n\t" "packuswb %%mm3, %%mm1 \n\t" MOVNTQ" %%mm0,- 7(%3, %0) \n\t" MOVNTQ" %%mm1,- 7(%2, %0) \n\t" "add $8, %0 \n\t" " js 1b \n\t" : "+r"(count) : "r"(src), "r"(dst0), "r"(dst1) ); count -= 7; } #endif src++; while(count<0) { dst0[count]= src[4*count+0]; dst1[count]= src[4*count+2]; count++; } }
1threat
How do you set a cell to null in datagrip? : <p>Is there a quick way without diving into sql to set a particular attribute back to null? entering in "" doesn't work. </p>
0debug
why does '10' is less than '7' or less while we are sorting? : >>> '10'>'3' False >>> >>> a=['10','9','8','7'] >>> a.sort() >>> a ['10', '7', '8', '9'] >>> why '10' is less than '3' ? I tried with several more values but the same thing is happening with it. >>> '10'>'3' False >>> >>> a=['10','9','8','7'] >>> a.sort() >>> a ['10', '7', '8', '9'] >>> I expect the output of '10'>'3' to be True
0debug
static inline void tlb_reset_dirty_range(CPUTLBEntry *tlb_entry, unsigned long start, unsigned long length) { unsigned long addr; if ((tlb_entry->addr_write & ~TARGET_PAGE_MASK) == io_mem_ram.ram_addr) { addr = (tlb_entry->addr_write & TARGET_PAGE_MASK) + tlb_entry->addend; if ((addr - start) < length) { tlb_entry->addr_write = (tlb_entry->addr_write & TARGET_PAGE_MASK) | TLB_NOTDIRTY; } } }
1threat
I an new to jquery can anyone tell how to create canvas element using jquery : I am doing this in between html code [enter image description here][1] [1]: https://i.stack.imgur.com/lAFHy.png
0debug
How to have multiple goroutines read the lines of a single file? : <p>I want to read a huge file, say > 1 GB, and have its lines processed by multiple worker goroutines.</p> <p>I'm worried that using a single goroutine (main) for reading the input lines will impose a bottleneck, when using a huge number of worker goroutines.</p> <p>How can I safely have multiple goroutines read the lines of the file? Is it possible split the input file into several chuncks and have each goroutine operate on a separate chunk individually?</p> <p>The following is sample code of having one goroutine read input lines with several worker goroutines processing them:</p> <pre><code>package main import ( "bufio" "fmt" "log" "os" ) func main() { file, err := os.Open("/path/to/file.txt") if err != nil { log.Fatal(err) } defer file.Close() lines := make(chan string) for i := 0; i &lt; 100; i++ { // start 100 workers to process input lines. // the workers terminate once 'lines' is closed. go worker(lines) } scanner := bufio.NewScanner(file) go func() { defer close(lines) for scanner.Scan() { lines &lt;- scanner.Text() } if err := scanner.Err(); err != nil { log.Fatal(err) } }() ... } </code></pre>
0debug
How to change date format in Ruby : I'm stuck on a question. How to return the incoming dates with the valid formats? I want to change_date_format(["2010/03/30","15/12/2016","11-15-2012, 20130720"] should return the list ["20100330","2016215", "2012115"] def change_date_format(dates) return [] end p change_date_format(["2010/03/30", "15/12/2016", "11-15-2012", "20130720"])
0debug
JavaScript alert not showing up in the browser page but at the right hand side of screen : <p>Visual Studio Code must have had an update. I wasn't able to find anything on their website but I wasn't seeing my alert I created like it used to appear on the browser page and then I noticed that the alert is at the bottom right hand of the screen outside of the browser page but inside the Visual Studio Code. Is there any way to make this show up in the browser preview using extension for Live Server?<a href="https://i.stack.imgur.com/aS6Mh.jpg" rel="nofollow noreferrer">See attached screen shot - scroll to the bottom right to see the alert.</a></p>
0debug
How to create unique keys for React elements? : <p>I am making a React app that allows you to make a list and save it, but React has been giving me a warning that my elements don't have a unique key prop (elements List/ListForm). How should I create a unique key prop for user created elements? Below is my React code</p> <pre><code>var TitleForm = React.createClass({ handleSubmit: function(e) { e.preventDefault(); var listName = {'name':this.refs.listName.value}; this.props.handleCreate(listName); this.refs.listName.value = ""; }, render: function() { return ( &lt;div&gt; &lt;form onSubmit={this.handleSubmit}&gt; &lt;input className='form-control list-input' type='text' ref='listName' placeholder="List Name"/&gt; &lt;br/&gt; &lt;button className="btn btn-primary" type="submit"&gt;Create&lt;/button&gt; &lt;/form&gt; &lt;/div&gt; ); } }); var ListForm = React.createClass({ getInitialState: function() { return {items:[{'name':'item1'}],itemCount:1}; }, handleSubmit: function(e) { e.preventDefault(); var list = {'name': this.props.name, 'data':[]}; var items = this.state.items; for (var i = 1; i &lt; items.length; i++) { list.data.push(this.refs[items[i].name]); } this.props.update(list); $('#'+this.props.name).remove(); }, handleClick: function() { this.setState({ items: this.state.items.concat({'name':'item'+this.state.itemCount+1}), itemCount: this.state.itemCount+1 }); }, handleDelete: function() { this.setState({ itemCount: this.state.itemCount-1 }); }, render: function() { var listItems = this.state.items.map(function(item) { return ( &lt;div&gt; &lt;input type="text" className="list-form" placeholder="List Item" ref={item.name}/&gt; &lt;br/&gt; &lt;/div&gt; ); }); return ( &lt;div&gt; &lt;form onSubmit={this.handleSubmit} className="well list-form-container"&gt; {listItems} &lt;br/&gt; &lt;div onClick={this.handleClick} className="btn btn-primary list-button"&gt;Add&lt;/div&gt; &lt;div onClick={this.handleDelete} className="btn btn-primary list-button"&gt;Delete&lt;/div&gt; &lt;button type="submit" className="btn btn-primary list-button"&gt;Save&lt;/button&gt; &lt;/form&gt; &lt;/div&gt; ) } }); var List = React.createClass({ getInitialState: function() { return {lists:[], savedLists: []}; }, handleCreate: function(listName) { this.setState({ lists: this.state.lists.concat(listName) }); }, updateSaved: function(list) { this.setState({ savedLists: this.state.savedLists.concat(list) }); }, render: function() { var lst = this; var lists = this.state.lists.map(function(list) { return( &lt;div&gt; &lt;div key={list.name} id={list.name}&gt; &lt;h2 key={"header"+list.name}&gt;{list.name}&lt;/h2&gt; &lt;ListForm update={lst.updateSaved} name={list.name}/&gt; &lt;/div&gt; &lt;/div&gt; ) }); var savedLists = this.state.savedLists.map(function(list) { var list_data = list.data; list_data.map(function(data) { return ( &lt;li&gt;{data}&lt;/li&gt; ) }); return( &lt;div&gt; &lt;h2&gt;{list.name}&lt;/h2&gt; &lt;ul&gt; {list_data} &lt;/ul&gt; &lt;/div&gt; ) }); var save_msg; if(savedLists.length == 0){ save_msg = 'No Saved Lists'; }else{ save_msg = 'Saved Lists'; } return ( &lt;div&gt; &lt;TitleForm handleCreate={this.handleCreate} /&gt; {lists} &lt;h2&gt;{save_msg}&lt;/h2&gt; {savedLists} &lt;/div&gt; ) } }); ReactDOM.render(&lt;List/&gt;,document.getElementById('app')); </code></pre> <p>My HTML:</p> <pre><code>&lt;div class="container"&gt; &lt;h1&gt;Title&lt;/h1&gt; &lt;div id="app" class="center"&gt;&lt;/div&gt; &lt;/div&gt; </code></pre>
0debug
int qemu_ram_addr_from_host(void *ptr, ram_addr_t *ram_addr) { RAMBlock *block; uint8_t *host = ptr; if (xen_enabled()) { *ram_addr = xen_ram_addr_from_mapcache(ptr); return 0; } QTAILQ_FOREACH(block, &ram_list.blocks, next) { if (block->host == NULL) { continue; } if (host - block->host < block->length) { *ram_addr = block->offset + (host - block->host); return 0; } } return -1; }
1threat
How to delete last found value in Bash : <p>Say I have a string <code>0.0.25</code>, how do I delete the last part after dot (including it) to make it like <code>0.0</code>? Note that last part can have variable number of digits. </p>
0debug
How you generate create query from existing table using java? : <p>I have few tables I want to generate create query(ddl) from existing tables , and convert to postgres compatible one from oracle .</p>
0debug
How to extract .tar.gz file : I want to extract an archive named kdiff3.tar.gz. Using tar -xzvf filename.tar.gz doesn't extract the file. it is gives this error: gzip: stdin: not in gzip format tar: Child returned status 1 tar: Error exit delayed from previous errors file kdiff3.tar.gz gives this message: kdiff3.tar.gz: HTML document, ASCII text It would be great, if someone could help.
0debug
A short C++ program about the pointer......PLZ : #include <iostream> using namespace std; int main() { int x[][3]={1,2,3,4,5}; cout<<&x <<" "<<*x <<" "<<x <<endl; cout<<&x[0]<<" "<<*x[0]<<" "<<x[0]<<endl; cout<<&x[0][0]<<endl; return 0; } The result is: 0x28fef8 0x28fef8 0x28fef8 0x28fef8 1 0x28fef8 0x28fef8 why x[0][0] and x are in the this pointer?? what is in the 0x28fef8? 1 or 0x28fef8
0debug
How to replace "\" with "/" in python? i'm getting an EOL error : <p>How can i replace ("\") with ("/") in python?</p> <pre><code>a = ("Hello/World") b = a.replace("/","\") print(b) </code></pre> <p>i was expected that i'll successfully replace but i'm getting an error EOL error</p>
0debug
Tail Call Optimization implementation in Javascript Engines : <p>As of February 2019 in Chrome Version <code>71.0.3578.98</code> on Mac , the following program throws <code>Uncaught RangeError: Maximum call stack size exceeded error.</code> at a count of <code>16516</code>.</p> <pre><code>const a = x =&gt; { console.log(x) a(x + 1) } a(1) </code></pre> <p>I've done quite a bit of Googling, but wasn't able to find any articles discussing Chrome or other browser support for Tail Call Optimization (TCO) or any future plans to implement it. </p> <p>My two questions are:</p> <ol> <li>Is TCO currently supported in Chrome or any other browser or Javascript Engine</li> <li>Are there plans to implement TCO in the near future in any Javascript Engine</li> </ol> <p>The posts that I've found are mostly old (2016 or earlier) or simply confusing. e.g. <a href="https://www.chromestatus.com/feature/5516876633341952" rel="noreferrer">https://www.chromestatus.com/feature/5516876633341952</a></p>
0debug
static uint32_t drc_set_usable(sPAPRDRConnector *drc) { if (!drc->dev) { return RTAS_OUT_NO_SUCH_INDICATOR; } if (drc->awaiting_release) { return RTAS_OUT_NO_SUCH_INDICATOR; } drc->allocation_state = SPAPR_DR_ALLOCATION_STATE_USABLE; return RTAS_OUT_SUCCESS; }
1threat
void sd_set_cb(SDState *sd, qemu_irq readonly, qemu_irq insert) { sd->readonly_cb = readonly; sd->inserted_cb = insert; qemu_set_irq(readonly, sd->bdrv ? bdrv_is_read_only(sd->bdrv) : 0); qemu_set_irq(insert, sd->bdrv ? bdrv_is_inserted(sd->bdrv) : 0); }
1threat
Sample Projects about the backup of Sqlite file and photos on Google Drive programmatically in Android : <p>I am writting an app with an instant messenger function. Users would write text or share images in the IM function. The <strong>text data and file name of images are stored in the sqlite file while the images are stored in the device</strong>.</p> <p>Since my server will not keep the data, the <strong>user would not get the chat record when he / she switches to a new device</strong>. With reference to whatsapp, they allow the users to back up the chat records and images to Google Drive on a regular basis and get the data back from the drive as shown in the picture below.</p> <p><a href="https://i.stack.imgur.com/eTJug.png" rel="noreferrer"><img src="https://i.stack.imgur.com/eTJug.png" alt="enter image description here"></a> When I go to my own google drive, I would find "Whatsapp" is connected to my google drive as shown below.</p> <p><a href="https://i.stack.imgur.com/1gPxB.png" rel="noreferrer"><img src="https://i.stack.imgur.com/1gPxB.png" alt="enter image description here"></a> I am new to the Google Drive API in Android and I would like to backup my sqlite database and images in the same way as Whatsapp does. I know there are few questions related to the back-up of sqlite database to Google Drive but it seems that it only contains part of the codes and this makes the beginner quite difficult to understand.</p> <p>Will there be any <strong>sample projects</strong> on github, questions on stackoverflow or tutorials that would allow me learn how to <strong>back up sqlite database and images programmatically in Android</strong> step by step?</p> <p>In additions, I am surprised to see <strong>only whatsapp is connected to my Google Drive</strong> but no more other apps, so I don't know if third-party developers would have the same access to Google Drive and complete the backup in the same way as Whatsapp.</p>
0debug
Keep getting ParseException: Unparseable date : <p>I am developing a simple android app and I need convert a string which contains date and time to date in java, android. Here is an example format of my string:</p> <pre><code>Sun May 20 18:07:13 EEST 2018 </code></pre> <p>And here is how I try to convert it to date:</p> <pre><code>SimpleDateFormat formatter = new SimpleDateFormat("EEE MMM dd HH:mm:ss zzz yyyy"); Date date = formatter.parse("Sun May 20 18:07:13 EEST 2018"); </code></pre> <p>This is the error message I get. Where am I doing wrong??</p> <pre><code>W/System.err: java.text.ParseException: Unparseable date: "Sun May 20 18:07:13 EEST 2018" (at offset 0) </code></pre>
0debug
Python Seaborn jointplot does not show the correlation coefficient and p-value on the chart : <p>I'm trying to plot jointplot with below and from samples I saw it should show the correlation coefficient and p-value on the chart. However it does not show those values on mine. Any advice? thanks.</p> <pre><code>import seaborn as sns sns.set(style="darkgrid", color_codes=True) sns.jointplot('Num of A', ' Ratio B', data = data_df, kind='reg', height=8) plt.show() </code></pre>
0debug
Laravel 5.3 Passport Custom Grants? : <p>I know I am not the only person who has come up to this point. Does anyone know how to <strong>properly</strong> implement a custom grant in Laravel(5.3) Passport? </p> <p>Or</p> <p>Have a good link/tutorial to reference how to properly do it?</p> <p>I know there's this package:</p> <p><a href="https://github.com/mikemclin/passport-custom-request-grant">https://github.com/mikemclin/passport-custom-request-grant</a></p> <p>But I'm asking for a more "Do it yourself" approach.</p> <p>Thank you in advance.</p>
0debug
R applying t.test comparing multiple group and exprot result as a table : <p>I have a data look like:</p> <pre><code>&gt; Group Time Result &gt; A 1 933 &gt; A 2 992 &gt; A 3 1007 &gt; A 4 1428 &gt; A 5 1068 &gt; A 6 721 &gt; A 1 1175 &gt; A 2 900 &gt; A 3 875 &gt; A 4 1775 &gt; A 5 986 &gt; A 6 963 &gt; A 1 1394 &gt; A 2 958 &gt; A 3 919 &gt; A 4 1103 &gt; A 5 940 &gt; A 6 919 &gt; C 2 1127 &gt; C 3 990 &gt; C 4 1033 &gt; C 5 1073 &gt; C 6 817 &gt; .... </code></pre> <p>and I have total 50 groups, and each grous have 6 time point, so I want to the P value result as a table, using t.test:</p> <pre><code>&gt; Time:1 &gt; A B C D E &gt; A / N N N N &gt; B N / N N N &gt; C N N / N N &gt; D N N N / N &gt; E N N N N / &gt; &gt; Time:2 &gt; A B C D E &gt; A / N N N N &gt; B N / N N N &gt; C N N / N N &gt; D N N N / N &gt; E N N N N / </code></pre> <p>Is this possible?</p> <p>Thank you for your help.</p>
0debug
cursor.execute('SELECT * FROM users WHERE username = ' + user_input)
1threat
Swift Nested Calling : <pre><code>class BFCommon { class BFUser { } extension BFUser { static func sayHello() { print("Hello") } } } BFCommon.BFUser.sayHello </code></pre> <p>Need help fixing my code to call a function like so.</p> <p>BFCommon.BFUser.sayHello</p>
0debug
static QDict *qmp_check_input_obj(QObject *input_obj) { const QDictEntry *ent; int has_exec_key = 0; QDict *input_dict; if (qobject_type(input_obj) != QTYPE_QDICT) { qerror_report(QERR_QMP_BAD_INPUT_OBJECT, "object"); return NULL; } input_dict = qobject_to_qdict(input_obj); for (ent = qdict_first(input_dict); ent; ent = qdict_next(input_dict, ent)){ const char *arg_name = qdict_entry_key(ent); const QObject *arg_obj = qdict_entry_value(ent); if (!strcmp(arg_name, "execute")) { if (qobject_type(arg_obj) != QTYPE_QSTRING) { qerror_report(QERR_QMP_BAD_INPUT_OBJECT_MEMBER, "execute", "string"); return NULL; } has_exec_key = 1; } else if (!strcmp(arg_name, "arguments")) { if (qobject_type(arg_obj) != QTYPE_QDICT) { qerror_report(QERR_QMP_BAD_INPUT_OBJECT_MEMBER, "arguments", "object"); return NULL; } } else if (!strcmp(arg_name, "id")) { } else { qerror_report(QERR_QMP_EXTRA_MEMBER, arg_name); return NULL; } } if (!has_exec_key) { qerror_report(QERR_QMP_BAD_INPUT_OBJECT, "execute"); return NULL; } return input_dict; }
1threat
How to get android running widgets? : <p>I am getting installed widgets but I need to find out only running widgets of home screen. Is there any possible solution.</p> <p>Thanks in advance.</p>
0debug
static void pprint_data(V9fsPDU *pdu, int rx, size_t *offsetp, const char *name) { struct iovec *sg = get_sg(pdu, rx); size_t offset = *offsetp; unsigned int count; int32_t size; int total, i, j; ssize_t len; if (rx) { count = pdu->elem.in_num; } else count = pdu->elem.out_num; }
1threat
how do i sort variables into other variables by ranking in python : so i am trying to make a program to imrove my python skills which is basically a lucky wheel, you get several items which are all ranked by numbers, i have made the items randomly generate but how would i make them print in order? i assume that the sort() method won't be any use in this situation. ``` # to sort: itemrating1, itemrating2, itemrating3 print(toprateditem) print(meditem) print(lowitem) ``` that is basically what i want to do, i hope i explained it well.
0debug
AddDbContext was called with configuration, but the context type 'MyContext' only declares a parameterless constructor? : <p>In the following console application (.Net core 2.0), the <code>scaffold-dbcontext</code> created the following <code>DbContext</code></p> <pre><code>public partial class MyContext : DbContext { public virtual DbSet&lt;Tables&gt; Tables { get; set; } protected override void OnConfiguring(DbContextOptionsBuilder optionsBuilder) { if (!optionsBuilder.IsConfigured) { optionsBuilder.UseSqlServer(Program.Conn); } } protected override void OnModelCreating(ModelBuilder modelBuilder) { .... } } </code></pre> <p>In the Main() (<code>static void Main(string[] args)</code>), the following code</p> <pre><code>var services = new ServiceCollection(); var conn = configuration.GetConnectionString("MySource"); services.AddDbContext&lt;MyContext&gt;(o =&gt; o.UseSqlServer(conn)); // Error </code></pre> <p>got the following run-time error? </p> <blockquote> <p>AddDbContext was called with configuration, but the context type 'MyContext' only declares a parameterless constructor. This means that the configuration passed to AddDbContext will never be used</p> </blockquote>
0debug
av_cold void ff_idctdsp_init_x86(IDCTDSPContext *c, AVCodecContext *avctx, unsigned high_bit_depth) { int cpu_flags = av_get_cpu_flags(); if (INLINE_MMX(cpu_flags)) { if (!high_bit_depth && avctx->lowres == 0 && (avctx->idct_algo == FF_IDCT_AUTO || avctx->idct_algo == FF_IDCT_SIMPLEAUTO || avctx->idct_algo == FF_IDCT_SIMPLEMMX)) { c->idct_put = ff_simple_idct_put_mmx; c->idct_add = ff_simple_idct_add_mmx; c->idct = ff_simple_idct_mmx; c->perm_type = FF_IDCT_PERM_SIMPLE; } } if (EXTERNAL_MMX(cpu_flags)) { c->put_signed_pixels_clamped = ff_put_signed_pixels_clamped_mmx; c->put_pixels_clamped = ff_put_pixels_clamped_mmx; c->add_pixels_clamped = ff_add_pixels_clamped_mmx; } if (EXTERNAL_SSE2(cpu_flags)) { c->put_signed_pixels_clamped = ff_put_signed_pixels_clamped_sse2; c->put_pixels_clamped = ff_put_pixels_clamped_sse2; c->add_pixels_clamped = ff_add_pixels_clamped_sse2; } if (ARCH_X86_64 && avctx->bits_per_raw_sample == 10 && avctx->lowres == 0 && (avctx->idct_algo == FF_IDCT_AUTO || avctx->idct_algo == FF_IDCT_SIMPLEAUTO || avctx->idct_algo == FF_IDCT_SIMPLE)) { if (EXTERNAL_SSE2(cpu_flags)) { c->idct_put = ff_simple_idct10_put_sse2; c->idct_add = NULL; c->idct = ff_simple_idct10_sse2; c->perm_type = FF_IDCT_PERM_TRANSPOSE; } if (EXTERNAL_AVX(cpu_flags)) { c->idct_put = ff_simple_idct10_put_avx; c->idct_add = NULL; c->idct = ff_simple_idct10_avx; c->perm_type = FF_IDCT_PERM_TRANSPOSE; } } }
1threat
Pass a function with parameters to a VoidCallback : <p>Is it possible to pass a Function with parameters to a VoidCallback?</p> <p>for example something like this:</p> <pre><code>class MyClass { void doSomething(int i){ } MyOtherClass myOtherClass = new MyOtherClass(doSomething); } class MyOtherClass { final VoidCallback callback(int); MyOtherClass(this.callback); callback(5); } </code></pre>
0debug
How to get last 4 digit here? : I am Using webide for my development. In Text field Automatic Value is coming from SAP Backend system. Please refer Screen Shot. < Label text="{Kostl}"/ > I need to get here Last 4 digit and not leading number. [enter image description here][1] [1]: https://i.stack.imgur.com/wsyBh.png
0debug
Regexp for mentions in html content : I'm trying to do a regexp for catch the mention in a html content. I have a content like this: <div data-user-id="@john">@john</div> I want catch only the mention inside the divs...not inside the "". I've done this regexp: http://regexr.com/3ckv8 ( /[^"]@[a-zA-Z0-9_]*[^"]/g ) That works almost fine...the problem is that it catch also the >< of the div tags. Any suggestion ? Thanks.
0debug
Annotation Processor in IntelliJ and Gradle : <p><strong>tl;dr</strong>: I cannot configure IntelliJ to generate the java files in the same directory as gradle</p> <p>I have a small project which uses the <a href="https://immutables.github.io/" rel="noreferrer">immutables</a> annotation processor. It works as expected in the gradle command line build, but I cannot get IntelliJ to output the generated files to the same directory.</p> <p>The full project is available on <a href="https://gitlab.com/tmtron-immutables/encoding/tree/0.0.1" rel="noreferrer">GitLab</a></p> <p><strong>Gradle config</strong>:<br> I use the folowing gradle plugins:</p> <ul> <li><a href="https://docs.gradle.org/current/userguide/idea_plugin.html" rel="noreferrer">gradle-idea plugin</a> which handles the idea configuration</li> <li><a href="https://github.com/tbroyer/gradle-apt-plugin" rel="noreferrer">gradle-apt-plugin</a> which provides the apt configuration and handles the compile-class path and idea config related to annotation processing (if also the idea plugin is applied)</li> </ul> <p>relevant parts of the build-script (<a href="https://gitlab.com/tmtron-immutables/encoding/blob/0.0.1/build.gradle" rel="noreferrer">link to the full listing</a>):</p> <pre><code>apply plugin: 'java' apply plugin: "net.ltgt.apt" apply plugin: 'idea' dependencies { def immutablesVersion = '2.3.9' compileOnly "org.immutables:value:$immutablesVersion:annotations" compileOnly "org.immutables:encode:$immutablesVersion" apt "org.immutables:value:$immutablesVersion" } </code></pre> <p>when I start <code>./gradlew build</code> everything is as expected: <a href="https://gitlab.com/tmtron-immutables/encoding/blob/0.0.1/build.gradle" rel="noreferrer"><img src="https://i.stack.imgur.com/Es3n0.png" alt="enter image description here"></a></p> <ol> <li>The source file <code>DataEncoding.java</code> is processed an the generated java-file <code>DataEncodingEnabled.java</code> ends up in</li> <li><code>/build/generated/source/apt/main</code> under the expected package <code>com.tmtron.immutables.data</code></li> <li>and the generated file is also compiled to a .class file</li> </ol> <p>In IntelliJ I activate the annotation processing as suggested by the <a href="https://github.com/tbroyer/gradle-apt-plugin#intellij-idea" rel="noreferrer">gradle-apt-plugin docs</a>: <a href="https://github.com/tbroyer/gradle-apt-plugin#intellij-idea" rel="noreferrer"><img src="https://i.stack.imgur.com/bSSEx.png" alt="enter image description here"></a></p> <p>Then I execute <code>./gradlew clean</code> to make sure, that the previous files are gone and then I click <code>Build</code> - <code>Build Project</code> in IntelliJ.<br> The annotation processor is executed, but the problem is that the generated java file ends up in the wrong location: <a href="https://i.stack.imgur.com/bSSEx.png" rel="noreferrer"><img src="https://i.stack.imgur.com/2sc7u.png" alt="enter image description here"></a></p> <p>It is in: /build/generated/source/apt/main/<strong>build/generated/source/apt/main</strong>/com.tmtron.immutables.data<br> the bold part is redundant. </p> <p>What am I doing wrong and how can I set it up correctly, so that IntelliJ and gradle generate the files in the same directory?</p> <p>Notes: </p> <ul> <li>I have of course already tried to just leave the "Production sources dir" in the IntelliJ annotation configuration empty, but his does not work: then it automatically uses "generated" and I also end up with a wrong path.</li> <li>IntelliJ version 2016.3.4</li> </ul>
0debug
void do_compute_hflags (CPUPPCState *env) { env->hflags = (msr_pr << MSR_PR) | (msr_le << MSR_LE) | (msr_fp << MSR_FP) | (msr_fe0 << MSR_FE0) | (msr_fe1 << MSR_FE1) | (msr_vr << MSR_VR) | (msr_ap << MSR_AP) | (msr_sa << MSR_SA) | (msr_se << MSR_SE) | (msr_be << MSR_BE); #if defined (TARGET_PPC64) env->hflags |= (msr_sf << MSR_SF) | (msr_hv << MSR_HV); #endif }
1threat
How to install Cargo on a Linux server? : I tried installing cargo on RHEL server with: curl https://sh.rustup.rs -sSf | sh but after finishing - got response: cargo -bash: cargo: command not found Is there a different way to install?
0debug
Why fwrite prints spaces while writing binary file in C? : #include <stdio.h> struct my_struct{ char text[50]; } e; int main(){ FILE *file; file = fopen("filename", "ab+"); if(file == NULL){ file = fopen("filename", "wb+"); } printf("Input text: "); fflush(stdin); gets(e.text); fwrite(&e, sizeof(e), 1, file); fclose(file); return 0; } What I'm trying to do here is create a binary file and write the file by text input from the user. The code works fine! The only problem is that the file contains spaces, which I believe is due to the array size of `struct my_structure` that `fwrite` passes while writing the file. I cannot find a good way to remove spaces or replace `fwrite`. Thank you! for answering this question.
0debug
def alternate_elements(list1): result=[] for item in list1[::2]: result.append(item) return result
0debug
static int transcode(AVFormatContext **output_files, int nb_output_files, InputFile *input_files, int nb_input_files, StreamMap *stream_maps, int nb_stream_maps) { int ret = 0, i, j, k, n, nb_ostreams = 0, step; AVFormatContext *is, *os; AVCodecContext *codec, *icodec; OutputStream *ost, **ost_table = NULL; InputStream *ist; char error[1024]; int key; int want_sdp = 1; uint8_t no_packet[MAX_FILES]={0}; int no_packet_count=0; int nb_frame_threshold[AVMEDIA_TYPE_NB]={0}; int nb_streams[AVMEDIA_TYPE_NB]={0}; if (rate_emu) for (i = 0; i < nb_input_streams; i++) input_streams[i].start = av_gettime(); nb_ostreams = 0; for(i=0;i<nb_output_files;i++) { os = output_files[i]; if (!os->nb_streams && !(os->oformat->flags & AVFMT_NOSTREAMS)) { av_dump_format(output_files[i], i, output_files[i]->filename, 1); fprintf(stderr, "Output file #%d does not contain any stream\n", i); ret = AVERROR(EINVAL); goto fail; } nb_ostreams += os->nb_streams; } if (nb_stream_maps > 0 && nb_stream_maps != nb_ostreams) { fprintf(stderr, "Number of stream maps must match number of output streams\n"); ret = AVERROR(EINVAL); goto fail; } for(i=0;i<nb_stream_maps;i++) { int fi = stream_maps[i].file_index; int si = stream_maps[i].stream_index; if (fi < 0 || fi > nb_input_files - 1 || si < 0 || si > input_files[fi].ctx->nb_streams - 1) { fprintf(stderr,"Could not find input stream #%d.%d\n", fi, si); ret = AVERROR(EINVAL); goto fail; } fi = stream_maps[i].sync_file_index; si = stream_maps[i].sync_stream_index; if (fi < 0 || fi > nb_input_files - 1 || si < 0 || si > input_files[fi].ctx->nb_streams - 1) { fprintf(stderr,"Could not find sync stream #%d.%d\n", fi, si); ret = AVERROR(EINVAL); goto fail; } } ost_table = av_mallocz(sizeof(OutputStream *) * nb_ostreams); if (!ost_table) goto fail; for(k=0;k<nb_output_files;k++) { os = output_files[k]; for(i=0;i<os->nb_streams;i++,n++) { nb_streams[os->streams[i]->codec->codec_type]++; } } for(step=1<<30; step; step>>=1){ int found_streams[AVMEDIA_TYPE_NB]={0}; for(j=0; j<AVMEDIA_TYPE_NB; j++) nb_frame_threshold[j] += step; for(j=0; j<nb_input_streams; j++) { int skip=0; ist = &input_streams[j]; if(opt_programid){ int pi,si; AVFormatContext *f= input_files[ ist->file_index ].ctx; skip=1; for(pi=0; pi<f->nb_programs; pi++){ AVProgram *p= f->programs[pi]; if(p->id == opt_programid) for(si=0; si<p->nb_stream_indexes; si++){ if(f->streams[ p->stream_index[si] ] == ist->st) skip=0; } } } if (ist->discard && ist->st->discard != AVDISCARD_ALL && !skip && nb_frame_threshold[ist->st->codec->codec_type] <= ist->st->codec_info_nb_frames){ found_streams[ist->st->codec->codec_type]++; } } for(j=0; j<AVMEDIA_TYPE_NB; j++) if(found_streams[j] < nb_streams[j]) nb_frame_threshold[j] -= step; } n = 0; for(k=0;k<nb_output_files;k++) { os = output_files[k]; for(i=0;i<os->nb_streams;i++,n++) { int found; ost = ost_table[n] = output_streams_for_file[k][i]; if (nb_stream_maps > 0) { ost->source_index = input_files[stream_maps[n].file_index].ist_index + stream_maps[n].stream_index; if (input_streams[ost->source_index].st->codec->codec_type != ost->st->codec->codec_type) { int i= ost->file_index; av_dump_format(output_files[i], i, output_files[i]->filename, 1); fprintf(stderr, "Codec type mismatch for mapping #%d.%d -> #%d.%d\n", stream_maps[n].file_index, stream_maps[n].stream_index, ost->file_index, ost->index); ffmpeg_exit(1); } } else { found = 0; for (j = 0; j < nb_input_streams; j++) { int skip=0; ist = &input_streams[j]; if(opt_programid){ int pi,si; AVFormatContext *f = input_files[ist->file_index].ctx; skip=1; for(pi=0; pi<f->nb_programs; pi++){ AVProgram *p= f->programs[pi]; if(p->id == opt_programid) for(si=0; si<p->nb_stream_indexes; si++){ if(f->streams[ p->stream_index[si] ] == ist->st) skip=0; } } } if (ist->discard && ist->st->discard != AVDISCARD_ALL && !skip && ist->st->codec->codec_type == ost->st->codec->codec_type && nb_frame_threshold[ist->st->codec->codec_type] <= ist->st->codec_info_nb_frames) { ost->source_index = j; found = 1; break; } } if (!found) { if(! opt_programid) { for (j = 0; j < nb_input_streams; j++) { ist = &input_streams[j]; if ( ist->st->codec->codec_type == ost->st->codec->codec_type && ist->st->discard != AVDISCARD_ALL) { ost->source_index = j; found = 1; } } } if (!found) { int i= ost->file_index; av_dump_format(output_files[i], i, output_files[i]->filename, 1); fprintf(stderr, "Could not find input stream matching output stream #%d.%d\n", ost->file_index, ost->index); ffmpeg_exit(1); } } } ist = &input_streams[ost->source_index]; ist->discard = 0; ost->sync_ist = (nb_stream_maps > 0) ? &input_streams[input_files[stream_maps[n].sync_file_index].ist_index + stream_maps[n].sync_stream_index] : ist; } } for(i=0;i<nb_ostreams;i++) { ost = ost_table[i]; os = output_files[ost->file_index]; ist = &input_streams[ost->source_index]; codec = ost->st->codec; icodec = ist->st->codec; if (metadata_streams_autocopy) av_dict_copy(&ost->st->metadata, ist->st->metadata, AV_DICT_DONT_OVERWRITE); ost->st->disposition = ist->st->disposition; codec->bits_per_raw_sample= icodec->bits_per_raw_sample; codec->chroma_sample_location = icodec->chroma_sample_location; if (ost->st->stream_copy) { uint64_t extra_size = (uint64_t)icodec->extradata_size + FF_INPUT_BUFFER_PADDING_SIZE; if (extra_size > INT_MAX) goto fail; codec->codec_id = icodec->codec_id; codec->codec_type = icodec->codec_type; if(!codec->codec_tag){ if( !os->oformat->codec_tag || av_codec_get_id (os->oformat->codec_tag, icodec->codec_tag) == codec->codec_id || av_codec_get_tag(os->oformat->codec_tag, icodec->codec_id) <= 0) codec->codec_tag = icodec->codec_tag; } codec->bit_rate = icodec->bit_rate; codec->rc_max_rate = icodec->rc_max_rate; codec->rc_buffer_size = icodec->rc_buffer_size; codec->extradata= av_mallocz(extra_size); if (!codec->extradata) goto fail; memcpy(codec->extradata, icodec->extradata, icodec->extradata_size); codec->extradata_size= icodec->extradata_size; codec->time_base = ist->st->time_base; if(!strcmp(os->oformat->name, "avi")) { if(!copy_tb && av_q2d(icodec->time_base)*icodec->ticks_per_frame > 2*av_q2d(ist->st->time_base) && av_q2d(ist->st->time_base) < 1.0/500){ codec->time_base = icodec->time_base; codec->time_base.num *= icodec->ticks_per_frame; codec->time_base.den *= 2; } } else if(!(os->oformat->flags & AVFMT_VARIABLE_FPS)) { if(!copy_tb && av_q2d(icodec->time_base)*icodec->ticks_per_frame > av_q2d(ist->st->time_base) && av_q2d(ist->st->time_base) < 1.0/500){ codec->time_base = icodec->time_base; codec->time_base.num *= icodec->ticks_per_frame; } } av_reduce(&codec->time_base.num, &codec->time_base.den, codec->time_base.num, codec->time_base.den, INT_MAX); switch(codec->codec_type) { case AVMEDIA_TYPE_AUDIO: if(audio_volume != 256) { fprintf(stderr,"-acodec copy and -vol are incompatible (frames are not decoded)\n"); ffmpeg_exit(1); } codec->channel_layout = icodec->channel_layout; codec->sample_rate = icodec->sample_rate; codec->channels = icodec->channels; codec->frame_size = icodec->frame_size; codec->audio_service_type = icodec->audio_service_type; codec->block_align= icodec->block_align; if(codec->block_align == 1 && codec->codec_id == CODEC_ID_MP3) codec->block_align= 0; if(codec->codec_id == CODEC_ID_AC3) codec->block_align= 0; break; case AVMEDIA_TYPE_VIDEO: codec->pix_fmt = icodec->pix_fmt; codec->width = icodec->width; codec->height = icodec->height; codec->has_b_frames = icodec->has_b_frames; if (!codec->sample_aspect_ratio.num) { codec->sample_aspect_ratio = ost->st->sample_aspect_ratio = ist->st->sample_aspect_ratio.num ? ist->st->sample_aspect_ratio : ist->st->codec->sample_aspect_ratio.num ? ist->st->codec->sample_aspect_ratio : (AVRational){0, 1}; } break; case AVMEDIA_TYPE_SUBTITLE: codec->width = icodec->width; codec->height = icodec->height; break; case AVMEDIA_TYPE_DATA: break; default: abort(); } } else { if (!ost->enc) ost->enc = avcodec_find_encoder(ost->st->codec->codec_id); switch(codec->codec_type) { case AVMEDIA_TYPE_AUDIO: ost->fifo= av_fifo_alloc(1024); if(!ost->fifo) goto fail; ost->reformat_pair = MAKE_SFMT_PAIR(AV_SAMPLE_FMT_NONE,AV_SAMPLE_FMT_NONE); if (!codec->sample_rate) { codec->sample_rate = icodec->sample_rate; } choose_sample_rate(ost->st, ost->enc); codec->time_base = (AVRational){1, codec->sample_rate}; if (codec->sample_fmt == AV_SAMPLE_FMT_NONE) codec->sample_fmt = icodec->sample_fmt; choose_sample_fmt(ost->st, ost->enc); if (!codec->channels) { codec->channels = icodec->channels; codec->channel_layout = icodec->channel_layout; } if (av_get_channel_layout_nb_channels(codec->channel_layout) != codec->channels) codec->channel_layout = 0; ost->audio_resample = codec->sample_rate != icodec->sample_rate || audio_sync_method > 1; icodec->request_channels = codec->channels; ist->decoding_needed = 1; ost->encoding_needed = 1; ost->resample_sample_fmt = icodec->sample_fmt; ost->resample_sample_rate = icodec->sample_rate; ost->resample_channels = icodec->channels; break; case AVMEDIA_TYPE_VIDEO: if (codec->pix_fmt == PIX_FMT_NONE) codec->pix_fmt = icodec->pix_fmt; choose_pixel_fmt(ost->st, ost->enc); if (ost->st->codec->pix_fmt == PIX_FMT_NONE) { fprintf(stderr, "Video pixel format is unknown, stream cannot be encoded\n"); ffmpeg_exit(1); } if (!codec->width || !codec->height) { codec->width = icodec->width; codec->height = icodec->height; } ost->video_resample = codec->width != icodec->width || codec->height != icodec->height || codec->pix_fmt != icodec->pix_fmt; if (ost->video_resample) { codec->bits_per_raw_sample= frame_bits_per_raw_sample; } ost->resample_height = icodec->height; ost->resample_width = icodec->width; ost->resample_pix_fmt= icodec->pix_fmt; ost->encoding_needed = 1; ist->decoding_needed = 1; if (!ost->frame_rate.num) ost->frame_rate = ist->st->r_frame_rate.num ? ist->st->r_frame_rate : (AVRational){25,1}; if (ost->enc && ost->enc->supported_framerates && !force_fps) { int idx = av_find_nearest_q_idx(ost->frame_rate, ost->enc->supported_framerates); ost->frame_rate = ost->enc->supported_framerates[idx]; } codec->time_base = (AVRational){ost->frame_rate.den, ost->frame_rate.num}; if( av_q2d(codec->time_base) < 0.001 && video_sync_method && (video_sync_method==1 || (video_sync_method<0 && !(os->oformat->flags & AVFMT_VARIABLE_FPS)))){ av_log(os, AV_LOG_WARNING, "Frame rate very high for a muxer not effciciently supporting it.\n" "Please consider specifiying a lower framerate, a different muxer or -vsync 2\n"); } #if CONFIG_AVFILTER if (configure_video_filters(ist, ost)) { fprintf(stderr, "Error opening filters!\n"); exit(1); } #endif break; case AVMEDIA_TYPE_SUBTITLE: ost->encoding_needed = 1; ist->decoding_needed = 1; break; default: abort(); break; } if (ost->encoding_needed && codec->codec_id != CODEC_ID_H264 && (codec->flags & (CODEC_FLAG_PASS1 | CODEC_FLAG_PASS2))) { char logfilename[1024]; FILE *f; snprintf(logfilename, sizeof(logfilename), "%s-%d.log", pass_logfilename_prefix ? pass_logfilename_prefix : DEFAULT_PASS_LOGFILENAME_PREFIX, i); if (codec->flags & CODEC_FLAG_PASS1) { f = fopen(logfilename, "wb"); if (!f) { fprintf(stderr, "Cannot write log file '%s' for pass-1 encoding: %s\n", logfilename, strerror(errno)); ffmpeg_exit(1); } ost->logfile = f; } else { char *logbuffer; size_t logbuffer_size; if (read_file(logfilename, &logbuffer, &logbuffer_size) < 0) { fprintf(stderr, "Error reading log file '%s' for pass-2 encoding\n", logfilename); ffmpeg_exit(1); } codec->stats_in = logbuffer; } } } if(codec->codec_type == AVMEDIA_TYPE_VIDEO){ int size= codec->width * codec->height; bit_buffer_size= FFMAX(bit_buffer_size, 6*size + 1664); } } if (!bit_buffer) bit_buffer = av_malloc(bit_buffer_size); if (!bit_buffer) { fprintf(stderr, "Cannot allocate %d bytes output buffer\n", bit_buffer_size); ret = AVERROR(ENOMEM); goto fail; } for(i=0;i<nb_ostreams;i++) { ost = ost_table[i]; if (ost->encoding_needed) { AVCodec *codec = ost->enc; AVCodecContext *dec = input_streams[ost->source_index].st->codec; if (!codec) { snprintf(error, sizeof(error), "Encoder (codec id %d) not found for output stream #%d.%d", ost->st->codec->codec_id, ost->file_index, ost->index); ret = AVERROR(EINVAL); goto dump_format; } if (dec->subtitle_header) { ost->st->codec->subtitle_header = av_malloc(dec->subtitle_header_size); if (!ost->st->codec->subtitle_header) { ret = AVERROR(ENOMEM); goto dump_format; } memcpy(ost->st->codec->subtitle_header, dec->subtitle_header, dec->subtitle_header_size); ost->st->codec->subtitle_header_size = dec->subtitle_header_size; } if (avcodec_open2(ost->st->codec, codec, &ost->opts) < 0) { snprintf(error, sizeof(error), "Error while opening encoder for output stream #%d.%d - maybe incorrect parameters such as bit_rate, rate, width or height", ost->file_index, ost->index); ret = AVERROR(EINVAL); goto dump_format; } assert_codec_experimental(ost->st->codec, 1); assert_avoptions(ost->opts); if (ost->st->codec->bit_rate && ost->st->codec->bit_rate < 1000) av_log(NULL, AV_LOG_WARNING, "The bitrate parameter is set too low." "It takes bits/s as argument, not kbits/s\n"); extra_size += ost->st->codec->extradata_size; } } for (i = 0; i < nb_input_streams; i++) { ist = &input_streams[i]; if (ist->decoding_needed) { AVCodec *codec = ist->dec; if (!codec) codec = avcodec_find_decoder(ist->st->codec->codec_id); if (!codec) { snprintf(error, sizeof(error), "Decoder (codec id %d) not found for input stream #%d.%d", ist->st->codec->codec_id, ist->file_index, ist->st->index); ret = AVERROR(EINVAL); goto dump_format; } if (avcodec_open2(ist->st->codec, codec, &ist->opts) < 0) { snprintf(error, sizeof(error), "Error while opening decoder for input stream #%d.%d", ist->file_index, ist->st->index); ret = AVERROR(EINVAL); goto dump_format; } assert_codec_experimental(ist->st->codec, 0); assert_avoptions(ost->opts); } } for (i = 0; i < nb_input_streams; i++) { AVStream *st; ist = &input_streams[i]; st= ist->st; ist->pts = st->avg_frame_rate.num ? - st->codec->has_b_frames*AV_TIME_BASE / av_q2d(st->avg_frame_rate) : 0; ist->next_pts = AV_NOPTS_VALUE; ist->is_start = 1; } for (i=0;i<nb_meta_data_maps;i++) { AVFormatContext *files[2]; AVDictionary **meta[2]; int j; #define METADATA_CHECK_INDEX(index, nb_elems, desc)\ if ((index) < 0 || (index) >= (nb_elems)) {\ snprintf(error, sizeof(error), "Invalid %s index %d while processing metadata maps\n",\ (desc), (index));\ ret = AVERROR(EINVAL);\ goto dump_format;\ } int out_file_index = meta_data_maps[i][0].file; int in_file_index = meta_data_maps[i][1].file; if (in_file_index < 0 || out_file_index < 0) continue; METADATA_CHECK_INDEX(out_file_index, nb_output_files, "output file") METADATA_CHECK_INDEX(in_file_index, nb_input_files, "input file") files[0] = output_files[out_file_index]; files[1] = input_files[in_file_index].ctx; for (j = 0; j < 2; j++) { MetadataMap *map = &meta_data_maps[i][j]; switch (map->type) { case 'g': meta[j] = &files[j]->metadata; break; case 's': METADATA_CHECK_INDEX(map->index, files[j]->nb_streams, "stream") meta[j] = &files[j]->streams[map->index]->metadata; break; case 'c': METADATA_CHECK_INDEX(map->index, files[j]->nb_chapters, "chapter") meta[j] = &files[j]->chapters[map->index]->metadata; break; case 'p': METADATA_CHECK_INDEX(map->index, files[j]->nb_programs, "program") meta[j] = &files[j]->programs[map->index]->metadata; break; } } av_dict_copy(meta[0], *meta[1], AV_DICT_DONT_OVERWRITE); } if (metadata_global_autocopy) { for (i = 0; i < nb_output_files; i++) av_dict_copy(&output_files[i]->metadata, input_files[0].ctx->metadata, AV_DICT_DONT_OVERWRITE); } for (i = 0; i < nb_chapter_maps; i++) { int infile = chapter_maps[i].in_file; int outfile = chapter_maps[i].out_file; if (infile < 0 || outfile < 0) continue; if (infile >= nb_input_files) { snprintf(error, sizeof(error), "Invalid input file index %d in chapter mapping.\n", infile); ret = AVERROR(EINVAL); goto dump_format; } if (outfile >= nb_output_files) { snprintf(error, sizeof(error), "Invalid output file index %d in chapter mapping.\n",outfile); ret = AVERROR(EINVAL); goto dump_format; } copy_chapters(infile, outfile); } if (!nb_chapter_maps) for (i = 0; i < nb_input_files; i++) { if (!input_files[i].ctx->nb_chapters) continue; for (j = 0; j < nb_output_files; j++) if ((ret = copy_chapters(i, j)) < 0) goto dump_format; break; } for(i=0;i<nb_output_files;i++) { os = output_files[i]; if (avformat_write_header(os, &output_opts[i]) < 0) { snprintf(error, sizeof(error), "Could not write header for output file #%d (incorrect codec parameters ?)", i); ret = AVERROR(EINVAL); goto dump_format; } if (strcmp(output_files[i]->oformat->name, "rtp")) { want_sdp = 0; } } dump_format: for(i=0;i<nb_output_files;i++) { av_dump_format(output_files[i], i, output_files[i]->filename, 1); } if (verbose >= 0) { fprintf(stderr, "Stream mapping:\n"); for(i=0;i<nb_ostreams;i++) { ost = ost_table[i]; fprintf(stderr, " Stream #%d.%d -> #%d.%d", input_streams[ost->source_index].file_index, input_streams[ost->source_index].st->index, ost->file_index, ost->index); if (ost->sync_ist != &input_streams[ost->source_index]) fprintf(stderr, " [sync #%d.%d]", ost->sync_ist->file_index, ost->sync_ist->st->index); fprintf(stderr, "\n"); } } if (ret) { fprintf(stderr, "%s\n", error); goto fail; } if (want_sdp) { print_sdp(output_files, nb_output_files); } if (!using_stdin) { if(verbose >= 0) fprintf(stderr, "Press [q] to stop, [?] for help\n"); avio_set_interrupt_cb(decode_interrupt_cb); } term_init(); timer_start = av_gettime(); for(; received_sigterm == 0;) { int file_index, ist_index; AVPacket pkt; double ipts_min; double opts_min; redo: ipts_min= 1e100; opts_min= 1e100; if (!using_stdin) { if (q_pressed) break; key = read_key(); if (key == 'q') break; if (key == '+') verbose++; if (key == '-') verbose--; if (key == 's') qp_hist ^= 1; if (key == 'h'){ if (do_hex_dump){ do_hex_dump = do_pkt_dump = 0; } else if(do_pkt_dump){ do_hex_dump = 1; } else do_pkt_dump = 1; av_log_set_level(AV_LOG_DEBUG); } if (key == 'd' || key == 'D'){ int debug=0; if(key == 'D') { debug = input_streams[0].st->codec->debug<<1; if(!debug) debug = 1; while(debug & (FF_DEBUG_DCT_COEFF|FF_DEBUG_VIS_QP|FF_DEBUG_VIS_MB_TYPE)) debug += debug; }else scanf("%d", &debug); for(i=0;i<nb_input_streams;i++) { input_streams[i].st->codec->debug = debug; } for(i=0;i<nb_ostreams;i++) { ost = ost_table[i]; ost->st->codec->debug = debug; } if(debug) av_log_set_level(AV_LOG_DEBUG); fprintf(stderr,"debug=%d\n", debug); } if (key == '?'){ fprintf(stderr, "key function\n" "? show this help\n" "+ increase verbosity\n" "- decrease verbosity\n" "D cycle through available debug modes\n" "h dump packets/hex press to cycle through the 3 states\n" "q quit\n" "s Show QP histogram\n" ); } } file_index = -1; for(i=0;i<nb_ostreams;i++) { double ipts, opts; ost = ost_table[i]; os = output_files[ost->file_index]; ist = &input_streams[ost->source_index]; if(ist->is_past_recording_time || no_packet[ist->file_index]) continue; opts = ost->st->pts.val * av_q2d(ost->st->time_base); ipts = (double)ist->pts; if (!input_files[ist->file_index].eof_reached){ if(ipts < ipts_min) { ipts_min = ipts; if(input_sync ) file_index = ist->file_index; } if(opts < opts_min) { opts_min = opts; if(!input_sync) file_index = ist->file_index; } } if(ost->frame_number >= max_frames[ost->st->codec->codec_type]){ file_index= -1; break; } } if (file_index < 0) { if(no_packet_count){ no_packet_count=0; memset(no_packet, 0, sizeof(no_packet)); usleep(10000); continue; } break; } if (limit_filesize != 0 && limit_filesize <= avio_tell(output_files[0]->pb)) break; is = input_files[file_index].ctx; ret= av_read_frame(is, &pkt); if(ret == AVERROR(EAGAIN)){ no_packet[file_index]=1; no_packet_count++; continue; } if (ret < 0) { input_files[file_index].eof_reached = 1; if (opt_shortest) break; else continue; } no_packet_count=0; memset(no_packet, 0, sizeof(no_packet)); if (do_pkt_dump) { av_pkt_dump_log2(NULL, AV_LOG_DEBUG, &pkt, do_hex_dump, is->streams[pkt.stream_index]); } if (pkt.stream_index >= input_files[file_index].ctx->nb_streams) goto discard_packet; ist_index = input_files[file_index].ist_index + pkt.stream_index; ist = &input_streams[ist_index]; if (ist->discard) goto discard_packet; if (pkt.dts != AV_NOPTS_VALUE) pkt.dts += av_rescale_q(input_files[ist->file_index].ts_offset, AV_TIME_BASE_Q, ist->st->time_base); if (pkt.pts != AV_NOPTS_VALUE) pkt.pts += av_rescale_q(input_files[ist->file_index].ts_offset, AV_TIME_BASE_Q, ist->st->time_base); if (ist->ts_scale) { if(pkt.pts != AV_NOPTS_VALUE) pkt.pts *= ist->ts_scale; if(pkt.dts != AV_NOPTS_VALUE) pkt.dts *= ist->ts_scale; } if (pkt.dts != AV_NOPTS_VALUE && ist->next_pts != AV_NOPTS_VALUE && (is->iformat->flags & AVFMT_TS_DISCONT)) { int64_t pkt_dts= av_rescale_q(pkt.dts, ist->st->time_base, AV_TIME_BASE_Q); int64_t delta= pkt_dts - ist->next_pts; if((FFABS(delta) > 1LL*dts_delta_threshold*AV_TIME_BASE || pkt_dts+1<ist->pts)&& !copy_ts){ input_files[ist->file_index].ts_offset -= delta; if (verbose > 2) fprintf(stderr, "timestamp discontinuity %"PRId64", new offset= %"PRId64"\n", delta, input_files[ist->file_index].ts_offset); pkt.dts-= av_rescale_q(delta, AV_TIME_BASE_Q, ist->st->time_base); if(pkt.pts != AV_NOPTS_VALUE) pkt.pts-= av_rescale_q(delta, AV_TIME_BASE_Q, ist->st->time_base); } } if (recording_time != INT64_MAX && (pkt.pts != AV_NOPTS_VALUE ? av_compare_ts(pkt.pts, ist->st->time_base, recording_time + start_time, (AVRational){1, 1000000}) : av_compare_ts(ist->pts, AV_TIME_BASE_Q, recording_time + start_time, (AVRational){1, 1000000}) )>= 0) { ist->is_past_recording_time = 1; goto discard_packet; } if (output_packet(ist, ist_index, ost_table, nb_ostreams, &pkt) < 0) { if (verbose >= 0) fprintf(stderr, "Error while decoding stream #%d.%d\n", ist->file_index, ist->st->index); if (exit_on_error) ffmpeg_exit(1); av_free_packet(&pkt); goto redo; } discard_packet: av_free_packet(&pkt); print_report(output_files, ost_table, nb_ostreams, 0); } for (i = 0; i < nb_input_streams; i++) { ist = &input_streams[i]; if (ist->decoding_needed) { output_packet(ist, i, ost_table, nb_ostreams, NULL); } } term_exit(); for(i=0;i<nb_output_files;i++) { os = output_files[i]; av_write_trailer(os); } print_report(output_files, ost_table, nb_ostreams, 1); for(i=0;i<nb_ostreams;i++) { ost = ost_table[i]; if (ost->encoding_needed) { av_freep(&ost->st->codec->stats_in); avcodec_close(ost->st->codec); } #if CONFIG_AVFILTER avfilter_graph_free(&ost->graph); #endif } for (i = 0; i < nb_input_streams; i++) { ist = &input_streams[i]; if (ist->decoding_needed) { avcodec_close(ist->st->codec); } } ret = 0; fail: av_freep(&bit_buffer); if (ost_table) { for(i=0;i<nb_ostreams;i++) { ost = ost_table[i]; if (ost) { if (ost->st->stream_copy) av_freep(&ost->st->codec->extradata); if (ost->logfile) { fclose(ost->logfile); ost->logfile = NULL; } av_fifo_free(ost->fifo); av_freep(&ost->st->codec->subtitle_header); av_free(ost->resample_frame.data[0]); av_free(ost->forced_kf_pts); if (ost->video_resample) sws_freeContext(ost->img_resample_ctx); if (ost->resample) audio_resample_close(ost->resample); if (ost->reformat_ctx) av_audio_convert_free(ost->reformat_ctx); av_dict_free(&ost->opts); av_free(ost); } } av_free(ost_table); } return ret; }
1threat
void dpy_gfx_replace_surface(QemuConsole *con, DisplaySurface *surface) { DisplayState *s = con->ds; DisplaySurface *old_surface = con->surface; DisplayChangeListener *dcl; con->surface = surface; QLIST_FOREACH(dcl, &s->listeners, next) { if (con != (dcl->con ? dcl->con : active_console)) { continue; } if (dcl->ops->dpy_gfx_switch) { dcl->ops->dpy_gfx_switch(dcl, surface); } } qemu_free_displaysurface(old_surface); }
1threat
8086 Assembly - get UNIX timestamp(seconds) : How can I get the current UNIX time on 8086 asm? I couldn't find any information on the internet. I know it's 32bit so it would have to fill 2 registers.
0debug
Get Azure DevOps server (on-premise) data using Power Bi desktop connector : I want to show analytics using power bi desktop from my azure devops server which is deployed on premise but unable to connect while the error shows analytics extension is not available for on premise server of devops. I also found the Odata endpoint method but is it possible to do it using connector? Please suggest.
0debug
WPF MVVM Registration Login : <p>I'm new to "wpf". I'm making some demo application. Have multiple windows inside Views Folder(Employee, Items...). I want to have navigation menu in the Main Window and on click to display the other windows - BUT with MVVM pattern. I've watched some tutorials about PRISM regions, but still I cannot understand - <strong>is it possible to put other **Windows</strong>(from the Views folder, not User Controls) inside Main window?** and if it is can someone point me to the right resources? Thank You.</p>
0debug
No definition for GetLength() in int[][] How do I fix this? : <p>I'm trying to solve the diagonal difference <a href="https://www.hackerrank.com/challenges/diagonal-difference/problem" rel="nofollow noreferrer">problem</a> but it keeps on giving me this problem:</p> <blockquote> <p>Type <code>int[][]' does not contain a definition for</code>GetLenght' and no extension method <code>GetLenght' of type</code>int[][]' could be found.</p> </blockquote> <pre><code>static int diagonalDifference(int[][] arr) { int result = 0; int result2 = 0; for (int i = 0; i &lt; arr.GetLenght(0); i++) { result =+ arr[i][i]; } for (int i = arr.GetLength(0); i &gt; 0; i--) { result2 =+ arr[i][i]; } return Math.Abs(result+result2); } </code></pre> <p>This are my references: </p> <pre><code>using System.CodeDom.Compiler; using System.Collections.Generic; using System.Collections; using System.ComponentModel; using System.Diagnostics.CodeAnalysis; using System.Globalization; using System.IO; using System.Linq; using System.Reflection; using System.Runtime.Serialization; using System.Text.RegularExpressions; using System.Text; using static System.Array; using System; </code></pre>
0debug
int float32_eq_signaling( float32 a, float32 b STATUS_PARAM ) { if ( ( ( extractFloat32Exp( a ) == 0xFF ) && extractFloat32Frac( a ) ) || ( ( extractFloat32Exp( b ) == 0xFF ) && extractFloat32Frac( b ) ) ) { float_raise( float_flag_invalid STATUS_VAR); return 0; } return ( a == b ) || ( (bits32) ( ( a | b )<<1 ) == 0 ); }
1threat
document.write('<script src="evil.js"></script>');
1threat
QemuOpts *qemu_opts_find(QemuOptsList *list, const char *id) { QemuOpts *opts; TAILQ_FOREACH(opts, &list->head, next) { if (!opts->id) { continue; } if (strcmp(opts->id, id) != 0) { continue; } return opts; } return NULL; }
1threat
static void close_peer_eventfds(IVShmemState *s, int posn) { int i, n; if (!ivshmem_has_feature(s, IVSHMEM_IOEVENTFD)) { return; } if (posn < 0 || posn >= s->nb_peers) { error_report("invalid peer %d", posn); return; } n = s->peers[posn].nb_eventfds; memory_region_transaction_begin(); for (i = 0; i < n; i++) { ivshmem_del_eventfd(s, posn, i); } memory_region_transaction_commit(); for (i = 0; i < n; i++) { event_notifier_cleanup(&s->peers[posn].eventfds[i]); } g_free(s->peers[posn].eventfds); s->peers[posn].nb_eventfds = 0; }
1threat
How bind Android DataBinding to Menu? : <p>As it supports Data Binding menu in android? I write this code, but error: "Error:(16, 26) No resource type specified (at 'visible' with value '@{item.visible}')."</p> <pre><code>&lt;menu xmlns:android="http://schemas.android.com/apk/res/android" xmlns:app="http://schemas.android.com/apk/res-auto"&gt; &lt;data&gt; &lt;variable name="item" type="ru.dixy.ubiworkerchecklistsmobile.Models.Fact"/&gt; &lt;import type="android.view.View"/&gt; &lt;/data&gt; &lt;item android:id="@+id/compliteitem" android:title="mybutton" android:icon="@drawable/complite" android:visible="@{item.visible}" app:showAsAction="ifRoom" /&gt; &lt;/menu&gt; </code></pre>
0debug
How can I get the length of a javaFX audio clip in millis : <p>Hi everyone I want to play a JavaFX audio clip and sleep the time that it takes this audio clip to finish. I can't find any method in the JavaFX AudioClip</p>
0debug
Diagrams symbols in UML , OOAD : <p>I have seen many symbols, notations related to UML and OOA&amp;D. Most of the times those symbols don't have any labels so I am not able to understand what they are. For example, we have symbol for <code>Generalization</code> , <code>Realization</code>, <code>Uses</code> etc.</p> <p>Is there any book , resource which depicts commonly used symbols and their meaning?</p>
0debug
How to test styled Material-UI components wrapped in withStyles using react-testing-library? : <p>I am trying to create a test with a styled Material-UI component using react-testing-library in typescript. I'm finding it difficult to access the internal functions of the component to mock and assert. </p> <p>Form.tsx</p> <pre><code>export const styles = ({ palette, spacing }: Theme) =&gt; createStyles({ root: { flexGrow: 1, }, paper: { padding: spacing.unit * 2, margin: spacing.unit * 2, textAlign: 'center', color: palette.text.secondary, }, button: { margin: spacing.unit * 2, } }); interface Props extends WithStyles&lt;typeof styles&gt; { }; export class ExampleForm extends Component&lt;Props, State&gt; { async handleSubmit(event: React.FormEvent&lt;HTMLFormElement&gt;) { // Handle form Submit ... if (errors) { window.alert('Some Error occurred'); return; } } // render the form } export default withStyles(styles)(ExampleForm); </code></pre> <p>Test.tsx</p> <pre><code>import FormWithStyles from './Form'; it('alerts on submit click', async () =&gt; { jest.spyOn(window,'alert').mockImplementation(()=&gt;{}); const spy = jest.spyOn(ActivityCreateStyles,'handleSubmit'); const { getByText, getByTestId } = render(&lt;FormWithStyles /&gt;) fireEvent.click(getByText('Submit')); expect(spy).toHaveBeenCalledTimes(1); expect(window.alert).toHaveBeenCalledTimes(1); }) </code></pre> <p><code>jest.spyOn</code> throws the following error <code>Argument of type '"handleSubmit"' is not assignable to parameter of type 'never'.ts(2345)</code> probably because ExampleForm in wrapped in withStyles. </p> <p>I also tried directly importing the <code>ExampleForm</code> component and manually assigning the styles, was couldn't do so:</p> <pre><code>import {ExampleForm, styles} from './Form'; it('alerts on submit click', async () =&gt; { ... const { getByText, getByTestId } = render(&lt;ActivityCreateForm classes={styles({palette,spacing})} /&gt;) ... } </code></pre> <p>Got the following error: <code>Type '{ palette: any; spacing: any; }' is missing the following properties from type 'Theme': shape, breakpoints, direction, mixins, and 4 more.ts(2345)</code></p> <p>I'm finding it difficult to write basic tests in Typescript for <code>Material-UI</code> components with <code>react-testing-library</code> &amp; <code>Jest</code> due to strong typings and wrapped components. Please Guide.</p>
0debug
Expiry date alert 30 days before expiry : <p>Is there a way for me, with a WPF application, to schedule an alert message to be shown 30 days before expiry date? What I have is a ValidFromDate, ValidToDate of user subscription.</p>
0debug
int av_parse_cpu_flags(const char *s) { #define CPUFLAG_MMXEXT (AV_CPU_FLAG_MMX | AV_CPU_FLAG_MMXEXT | AV_CPU_FLAG_CMOV) #define CPUFLAG_3DNOW (AV_CPU_FLAG_3DNOW | AV_CPU_FLAG_MMX) #define CPUFLAG_3DNOWEXT (AV_CPU_FLAG_3DNOWEXT | CPUFLAG_3DNOW) #define CPUFLAG_SSE (AV_CPU_FLAG_SSE | CPUFLAG_MMXEXT) #define CPUFLAG_SSE2 (AV_CPU_FLAG_SSE2 | CPUFLAG_SSE) #define CPUFLAG_SSE2SLOW (AV_CPU_FLAG_SSE2SLOW | CPUFLAG_SSE2) #define CPUFLAG_SSE3 (AV_CPU_FLAG_SSE3 | CPUFLAG_SSE2) #define CPUFLAG_SSE3SLOW (AV_CPU_FLAG_SSE3SLOW | CPUFLAG_SSE3) #define CPUFLAG_SSSE3 (AV_CPU_FLAG_SSSE3 | CPUFLAG_SSE3) #define CPUFLAG_SSE4 (AV_CPU_FLAG_SSE4 | CPUFLAG_SSSE3) #define CPUFLAG_SSE42 (AV_CPU_FLAG_SSE42 | CPUFLAG_SSE4) #define CPUFLAG_AVX (AV_CPU_FLAG_AVX | CPUFLAG_SSE42) #define CPUFLAG_AVXSLOW (AV_CPU_FLAG_AVXSLOW | CPUFLAG_AVX) #define CPUFLAG_XOP (AV_CPU_FLAG_XOP | CPUFLAG_AVX) #define CPUFLAG_FMA3 (AV_CPU_FLAG_FMA3 | CPUFLAG_AVX) #define CPUFLAG_FMA4 (AV_CPU_FLAG_FMA4 | CPUFLAG_AVX) #define CPUFLAG_AVX2 (AV_CPU_FLAG_AVX2 | CPUFLAG_AVX) #define CPUFLAG_BMI2 (AV_CPU_FLAG_BMI2 | AV_CPU_FLAG_BMI1) static const AVOption cpuflags_opts[] = { { "flags" , NULL, 0, AV_OPT_TYPE_FLAGS, { .i64 = 0 }, INT64_MIN, INT64_MAX, .unit = "flags" }, #if ARCH_PPC { "altivec" , NULL, 0, AV_OPT_TYPE_CONST, { .i64 = AV_CPU_FLAG_ALTIVEC }, .unit = "flags" }, #elif ARCH_X86 { "mmx" , NULL, 0, AV_OPT_TYPE_CONST, { .i64 = AV_CPU_FLAG_MMX }, .unit = "flags" }, { "mmxext" , NULL, 0, AV_OPT_TYPE_CONST, { .i64 = CPUFLAG_MMXEXT }, .unit = "flags" }, { "sse" , NULL, 0, AV_OPT_TYPE_CONST, { .i64 = CPUFLAG_SSE }, .unit = "flags" }, { "sse2" , NULL, 0, AV_OPT_TYPE_CONST, { .i64 = CPUFLAG_SSE2 }, .unit = "flags" }, { "sse2slow", NULL, 0, AV_OPT_TYPE_CONST, { .i64 = CPUFLAG_SSE2SLOW }, .unit = "flags" }, { "sse3" , NULL, 0, AV_OPT_TYPE_CONST, { .i64 = CPUFLAG_SSE3 }, .unit = "flags" }, { "sse3slow", NULL, 0, AV_OPT_TYPE_CONST, { .i64 = CPUFLAG_SSE3SLOW }, .unit = "flags" }, { "ssse3" , NULL, 0, AV_OPT_TYPE_CONST, { .i64 = CPUFLAG_SSSE3 }, .unit = "flags" }, { "atom" , NULL, 0, AV_OPT_TYPE_CONST, { .i64 = AV_CPU_FLAG_ATOM }, .unit = "flags" }, { "sse4.1" , NULL, 0, AV_OPT_TYPE_CONST, { .i64 = CPUFLAG_SSE4 }, .unit = "flags" }, { "sse4.2" , NULL, 0, AV_OPT_TYPE_CONST, { .i64 = CPUFLAG_SSE42 }, .unit = "flags" }, { "avx" , NULL, 0, AV_OPT_TYPE_CONST, { .i64 = CPUFLAG_AVX }, .unit = "flags" }, { "avxslow" , NULL, 0, AV_OPT_TYPE_CONST, { .i64 = CPUFLAG_AVXSLOW }, .unit = "flags" }, { "xop" , NULL, 0, AV_OPT_TYPE_CONST, { .i64 = CPUFLAG_XOP }, .unit = "flags" }, { "fma3" , NULL, 0, AV_OPT_TYPE_CONST, { .i64 = CPUFLAG_FMA3 }, .unit = "flags" }, { "fma4" , NULL, 0, AV_OPT_TYPE_CONST, { .i64 = CPUFLAG_FMA4 }, .unit = "flags" }, { "avx2" , NULL, 0, AV_OPT_TYPE_CONST, { .i64 = CPUFLAG_AVX2 }, .unit = "flags" }, { "bmi1" , NULL, 0, AV_OPT_TYPE_CONST, { .i64 = AV_CPU_FLAG_BMI1 }, .unit = "flags" }, { "bmi2" , NULL, 0, AV_OPT_TYPE_CONST, { .i64 = CPUFLAG_BMI2 }, .unit = "flags" }, { "3dnow" , NULL, 0, AV_OPT_TYPE_CONST, { .i64 = CPUFLAG_3DNOW }, .unit = "flags" }, { "3dnowext", NULL, 0, AV_OPT_TYPE_CONST, { .i64 = CPUFLAG_3DNOWEXT }, .unit = "flags" }, { "cmov", NULL, 0, AV_OPT_TYPE_CONST, { .i64 = AV_CPU_FLAG_CMOV }, .unit = "flags" }, #elif ARCH_ARM { "armv5te", NULL, 0, AV_OPT_TYPE_CONST, { .i64 = AV_CPU_FLAG_ARMV5TE }, .unit = "flags" }, { "armv6", NULL, 0, AV_OPT_TYPE_CONST, { .i64 = AV_CPU_FLAG_ARMV6 }, .unit = "flags" }, { "armv6t2", NULL, 0, AV_OPT_TYPE_CONST, { .i64 = AV_CPU_FLAG_ARMV6T2 }, .unit = "flags" }, { "vfp", NULL, 0, AV_OPT_TYPE_CONST, { .i64 = AV_CPU_FLAG_VFP }, .unit = "flags" }, { "vfpv3", NULL, 0, AV_OPT_TYPE_CONST, { .i64 = AV_CPU_FLAG_VFPV3 }, .unit = "flags" }, { "neon", NULL, 0, AV_OPT_TYPE_CONST, { .i64 = AV_CPU_FLAG_NEON }, .unit = "flags" }, #elif ARCH_AARCH64 { "armv8", NULL, 0, AV_OPT_TYPE_CONST, { .i64 = AV_CPU_FLAG_ARMV8 }, .unit = "flags" }, { "neon", NULL, 0, AV_OPT_TYPE_CONST, { .i64 = AV_CPU_FLAG_NEON }, .unit = "flags" }, { "vfp", NULL, 0, AV_OPT_TYPE_CONST, { .i64 = AV_CPU_FLAG_VFP }, .unit = "flags" }, #endif { NULL }, }; static const AVClass class = { .class_name = "cpuflags", .item_name = av_default_item_name, .option = cpuflags_opts, .version = LIBAVUTIL_VERSION_INT, }; int flags = 0, ret; const AVClass *pclass = &class; if ((ret = av_opt_eval_flags(&pclass, &cpuflags_opts[0], s, &flags)) < 0) return ret; return flags & INT_MAX; }
1threat
static int gif_parse_next_image(GifState *s) { for (;;) { int code = bytestream_get_byte(&s->bytestream); #ifdef DEBUG dprintf(s->avctx, "gif: code=%02x '%c'\n", code, code); #endif switch (code) { case ',': if (gif_read_image(s) < 0) return -1; return 0; case ';': return -1; case '!': if (gif_read_extension(s) < 0) return -1; break; default: return -1; } } }
1threat
how to convert video to gif image with audio in android using ffmpeg? : <p>I want convert video which upload from gallery or camera into GIF image with audio using FFMPEG andorid.Please give me proper solution for create gif image with sound.Please help me Thanks in adavace</p>
0debug
static int output_frame(AVFilterLink *outlink, int nb_samples) { AVFilterContext *ctx = outlink->src; MixContext *s = ctx->priv; AVFilterBufferRef *out_buf, *in_buf; int i; calculate_scales(s, nb_samples); out_buf = ff_get_audio_buffer(outlink, AV_PERM_WRITE, nb_samples); if (!out_buf) return AVERROR(ENOMEM); in_buf = ff_get_audio_buffer(outlink, AV_PERM_WRITE, nb_samples); if (!in_buf) return AVERROR(ENOMEM); for (i = 0; i < s->nb_inputs; i++) { if (s->input_state[i] == INPUT_ON) { int planes, plane_size, p; av_audio_fifo_read(s->fifos[i], (void **)in_buf->extended_data, nb_samples); planes = s->planar ? s->nb_channels : 1; plane_size = nb_samples * (s->planar ? 1 : s->nb_channels); plane_size = FFALIGN(plane_size, 16); for (p = 0; p < planes; p++) { s->fdsp.vector_fmac_scalar((float *)out_buf->extended_data[p], (float *) in_buf->extended_data[p], s->input_scale[i], plane_size); } } } avfilter_unref_buffer(in_buf); out_buf->pts = s->next_pts; if (s->next_pts != AV_NOPTS_VALUE) s->next_pts += nb_samples; return ff_filter_samples(outlink, out_buf); }
1threat
Android Studio: product flavor combination with more than two flavor dimensions (flavor groups) : <p>I am developing an Android application using Android Studio (v 2.1, gradle plugin v 2.1.0). My application has various versions which share a lot of common code so I decided to use flavor dimensions and product flavors to customize code and resources when and where it is requested. This worked fined as long as I only had two flavor dimensions. As an example, my <code>app.gradle</code> was</p> <pre><code>… flavorDimensions "fruit", "color" productFlavors { apple { dimension "fruit" } pear { dimension "fruit" } red { dimension "color" } yellow { dimension "color" } } … </code></pre> <p>and my <code>src</code> folder was</p> <pre><code>src/ appleRed/ appleYellow/ pearRed/ pearYellow/ </code></pre> <p>each one with a custom version of my code. Again, as an example</p> <pre><code>src/ appleRed/java/com/example/ExampleFragment.java appleYellow/java/com/example/ExampleFragment.java pearRed/java/com/example/ExampleFragment.java pearYellow/java/com/example/ExampleFragment.java </code></pre> <p>of course, there is no instance of <code>ExampleFragment</code> in <code>src/main</code>.</p> <p>At some point during development, I had to include a <em>free</em> and a <em>paid</em> version of the app. I thought that it could be easily achieved by adding a new flavor dimension named <code>version</code> and two product flavors named <code>free</code> and <code>paid</code>:</p> <pre><code> … flavorDimensions "fruit", "color”, “version” productFlavors { apple { dimension "fruit" } pear { dimension "fruit" } red { dimension "color" } yellow { dimension "color" } free { dimension "version" } paid { dimension “version” } } … </code></pre> <p>but all of a sudden the custom code generated by the combination of <code>fruit</code> and <code>color</code> was not detected by Android Studio anymore. So no <code>appleRed</code>, <code>appleYellow</code>, <code>pearRed</code> nor <code>pearYellow</code> can be used to have custom code and the only way I was able to regain my configuration was to use all the combinations of all the three flavour dimensions:</p> <pre><code> src/ appleRedFree/java/com/example/ExampleFragment.java appleRedPaid/java/com/example/ExampleFragment.java appleYellowFree/java/com/example/ExampleFragment.java appleYellowPaid/java/com/example/ExampleFragment.java pearRedFree/java/com/example/ExampleFragment.java pearRedPaid/java/com/example/ExampleFragment.java pearYellowFree/java/com/example/ExampleFragment.java pearYellowPaid/java/com/example/ExampleFragment.java </code></pre> <p>This is not good because <code>ExampleFragment</code> is duplicated across the same <code>fruitColor*</code> combination (<code>appleRedFree</code>, <code>appleRedPaid</code> have the same <code>ExampleFragment</code>). Same problem happens for resources (the ones in <code>res</code> folder).</p> <p>My questions are: </p> <p>1) Is this the expected behaviour from gradle in Android Studio (<em>i.e.</em>, not being able to combine a subset of product flavors, following their priority based on their dimension, when having more than two flavour dimensions)?</p> <p>2) Given the fact that this is the expected behaviour, is there another way I can achieve my customisation without duplicated code or without having a single file with an <em>if-statement</em> inside (<em>e.g.</em>, <code>if (BuildConfig.FLAVOR_version == "free") ...</code>) ? </p> <p>Please note that I’m talking about having custom code which could be complex, so I’m not asking for basic customisation like a build config variable, variant filtering, or something like that.</p>
0debug
static void pcie_aer_msg(PCIDevice *dev, const PCIEAERMsg *msg) { uint8_t type; while (dev) { if (!pci_is_express(dev)) { return; } type = pcie_cap_get_type(dev); if ((type == PCI_EXP_TYPE_ROOT_PORT || type == PCI_EXP_TYPE_UPSTREAM || type == PCI_EXP_TYPE_DOWNSTREAM) && !pcie_aer_msg_vbridge(dev, msg)) { return; } if (!pcie_aer_msg_alldev(dev, msg)) { return; } if (type == PCI_EXP_TYPE_ROOT_PORT) { pcie_aer_msg_root_port(dev, msg); return; } dev = pci_bridge_get_device(dev->bus); } }
1threat
int coroutine_fn qemu_co_sendv(int sockfd, struct iovec *iov, int len, int iov_offset) { int total = 0; int ret; while (len) { ret = qemu_sendv(sockfd, iov, len, iov_offset + total); if (ret < 0) { if (errno == EAGAIN) { qemu_coroutine_yield(); continue; } if (total == 0) { total = -1; } break; } total += ret, len -= ret; } return total; }
1threat